# Multiresolution Forecasting for Industrial Applications


## Abstract


## 1. Introduction

The main contributions of this work are:

- An open-source, application-oriented wavelet framework combined with automatic model selection through differential evolution optimization, with standardized access in Python and R.
- In contrast to prior works, a systematic comparison of our framework against state-of-the-art and openly accessible seasonal univariate forecasting methods.
- Wavelet forecasting performs equally well for short-term and long-term horizons.
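The automatic model selection mentioned above relies on differential evolution. As a hedged illustration only (the objective function, parameter names, and bounds below are invented stand-ins, not the framework's actual search space), SciPy's `differential_evolution` can drive such a hyperparameter search:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical toy objective: choose two hyperparameters (e.g. a window
# length and a regularization weight) that minimize a forecast-error
# surrogate. The stand-in error surface has its minimum at (10, 0.5).
def forecast_error(params):
    window, reg = params
    return (window - 10.0) ** 2 + (reg - 0.5) ** 2

# Differential evolution searches the box-constrained space globally,
# then (by default) polishes the best candidate with a local optimizer.
result = differential_evolution(
    forecast_error,
    bounds=[(1.0, 50.0), (0.0, 1.0)],  # search space per hyperparameter
    seed=0,
)
best_window, best_reg = result.x
```

In the framework, the objective would instead be a rolling-origin forecast error of a candidate wavelet model rather than this synthetic surface.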

## 2. Materials and Methods

### 2.1. Related Work

### 2.2. Rolling Forecast

- Split the time series into a training set and an evaluation set:
  - Training: observations 1, …, k + i − 1.
  - Evaluation: observation k + h + i − 1.
- Fit the forecasting method on the training set and compute the error of the resulting forecasting model for observation k + h + i − 1 on the evaluation set.
- Repeat the above steps for i = 1, …, t − k − h + 1, with t being the total length of the time series.
- Compute a quality measure based on the errors obtained from the previous steps.
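The steps above can be sketched as follows (function and argument names are illustrative, not taken from the accompanying packages):

```python
import numpy as np

def rolling_forecast_errors(series, k, h, forecast_fn):
    """Rolling forecasting origin: for each origin i, fit on the first
    k + i - 1 observations and score the h-step-ahead forecast against
    observation k + h + i - 1 (1-based indexing as in the text above)."""
    t = len(series)
    errors = []
    for i in range(1, t - k - h + 2):        # i = 1, ..., t - k - h + 1
        train = series[: k + i - 1]          # observations 1, ..., k + i - 1
        target = series[k + h + i - 2]       # observation k + h + i - 1
        errors.append(target - forecast_fn(train, h))
    return np.asarray(errors)

# Toy check with a naive forecast (repeat the last observed value):
naive = lambda train, h: train[-1]
errs = rolling_forecast_errors(np.arange(20.0), k=5, h=1, forecast_fn=naive)
```

A quality measure such as MASE or SMAPE is then computed over the collected errors.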

### 2.3. Datasets

### 2.4. Mean Absolute Scaled Error
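The error measure can be sketched from its standard definition (Hyndman and Koehler, cited in the references); the seasonal period `m` of the naive benchmark is an assumption of this sketch:

```python
import numpy as np

def mase(y_true, y_pred, y_train, m=1):
    """Mean Absolute Scaled Error: the mean absolute forecast error
    scaled by the in-sample mean absolute error of the (seasonal)
    naive forecast with period m (m=1 is the plain naive forecast)."""
    y_true, y_pred, y_train = (np.asarray(a, float) for a in (y_true, y_pred, y_train))
    scale = np.mean(np.abs(y_train[m:] - y_train[:-m]))
    return float(np.mean(np.abs(y_true - y_pred)) / scale)
```

Values below 1 indicate that the forecast beats the in-sample naive benchmark on average.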

### 2.5. Symmetric Mean Absolute Percentage Error
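As a minimal sketch, one widely used SMAPE variant (several definitions exist in the literature; the exact variant used here is not restated in this excerpt) is:

```python
import numpy as np

def smape(y_true, y_pred):
    """Symmetric Mean Absolute Percentage Error in percent:
    200 * mean(|y - yhat| / (|y| + |yhat|)), bounded in [0, 200]."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    return float(200.0 * np.mean(np.abs(y_true - y_pred) / (np.abs(y_true) + np.abs(y_pred))))
```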

### 2.6. MD Plot

### 2.7. Multiresolution Method

## 3. Results

## 4. Discussion

## 5. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## Appendix A. Data Visualization

**Figure A1.** Data visualization of the four time series used in this work with their measurement units: (**a**) Callcenter (number of issues per day), (**b**) Electricity (daily electricity demand in Germany in MWh), (**c**) Prices (daily system price in €) and (**d**) Stocks (American Airlines index taken from the S&P 500).

## Appendix B. SMAPE

**Figure A2.** Mirrored density plot of Symmetric Mean Absolute Percentage Errors (SMAPE) computed with a rolling forecasting origin for horizon 1 on dataset Callcenter.

**Figure A3.** Mirrored density plot of Symmetric Mean Absolute Percentage Errors (SMAPE) computed with a rolling forecasting origin for horizon 1 on dataset Electricity.

**Figure A4.** Mirrored density plot of Symmetric Mean Absolute Percentage Errors (SMAPE) computed with a rolling forecasting origin for horizon 1 on dataset Prices.

**Figure A5.** Mirrored density plot of Symmetric Mean Absolute Percentage Errors (SMAPE) computed with a rolling forecasting origin for horizon 1 on dataset Stocks.

**Figure A6.** Mirrored density plot of Symmetric Mean Absolute Percentage Errors (SMAPE) computed with a rolling forecasting origin over all horizons on dataset Callcenter.

**Figure A7.** Mirrored density plot of Symmetric Mean Absolute Percentage Errors (SMAPE) computed with a rolling forecasting origin over all horizons on dataset Electricity. The estimated distributions showing a magenta marking are approximately Gaussian distributed [44].

**Figure A8.** Mirrored density plot of Symmetric Mean Absolute Percentage Errors (SMAPE) computed with a rolling forecasting origin over all horizons on dataset Prices.

**Figure A9.** Mirrored density plot of Symmetric Mean Absolute Percentage Errors (SMAPE) computed with a rolling forecasting origin over all horizons on dataset Stocks. The estimated distributions showing a magenta marking are approximately Gaussian distributed [44].

## Appendix C. Statistical Tests

**Table A1.** The Kolmogorov–Smirnov statistic for testing whether the forecast error measured with MASE follows an F distribution, for all datasets and all forecasts for horizon = 1.

| Dataset | AA | LSTM | MAPA | MLP | MRNN | MRR | NN | Prophet | SA | MRANN | MRAR | XGB |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Elec. | 0.05 | 0.0 | 0.76 | 0.05 | 0.4 | 0.7 | 0.0 | 0.31 | 0.31 | 0.24 | 0.01 | 0.01 |
| Call. | 0.21 | 0.29 | 0.92 | 0.5 | 0.91 | 0.51 | 0.96 | 0.23 | 0.62 | 0.32 | 0.63 | 0.03 |
| Prices | 0.76 | 0.3 | 0.56 | 0.94 | 0.38 | 0.93 | 0.74 | 0.19 | 0.5 | 0.43 | 0.24 | 0.9 |
| Stocks | 0.0 | 0.99 | 0.9 | 0.79 | 0.39 | 0.77 | 0.71 | 0.29 | 0.0 | 0.22 | 0.85 | 0.68 |

**Table A2.** The Kolmogorov–Smirnov statistic for testing whether the forecast error measured with MASE follows an F distribution, for all datasets and all forecasts for horizons > 1.

| Dataset | AA | LSTM | MAPA | MLP | MRNN | MRR | NN | Prophet | SA | MRANN | MRAR | XGB |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Elec. | 0.53 | 0.0 | 0.78 | 0.98 | 0.98 | 0.61 | 0.02 | 0.19 | 0.44 | 0.23 | 0.88 | 0.77 |
| Call. | 0.18 | 0.89 | 0.38 | 0.52 | 0.72 | 0.47 | 0.49 | 0.08 | 0.12 | 0.6 | 0.8 | 1.0 |
| Prices | 0.47 | 0.88 | 0.05 | 0.67 | 1.0 | 0.47 | 0.77 | 0.12 | 0.84 | 0.94 | 0.61 | 0.69 |
| Stocks | 0.97 | 0.84 | 0.43 | 0.9 | 0.69 | 0.75 | 0.48 | 0.05 | 0.81 | 0.89 | 0.88 | 0.98 |

## Appendix D. QQ Plots

**Figure A10.** QQ plot between MRR and Prophet on dataset Callcenter with quality measure MASE over all horizons.

**Figure A11.** QQ plot between MRNN and Prophet on dataset Callcenter with quality measure MASE over all horizons.

**Figure A12.** QQ plot between MRR and Prophet on dataset Electricity with quality measure MASE over all horizons.

**Figure A13.** QQ plot between MRNN and Prophet on dataset Electricity with quality measure MASE over all horizons.
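The quantile pairs underlying such QQ plots can be computed as follows (a generic sketch, not the plotting code used for the figures):

```python
import numpy as np

def qq_pairs(sample_a, sample_b, n_quantiles=101):
    """Quantile pairs for a QQ plot comparing two empirical error
    distributions (e.g. MASE of MRR vs. Prophet). Points close to the
    identity line indicate similar distributions."""
    probs = np.linspace(0.0, 1.0, n_quantiles)
    return np.quantile(sample_a, probs), np.quantile(sample_b, probs)

# Identical samples fall exactly on the identity line:
qa, qb = qq_pairs([1.0, 2.0, 3.0, 4.0, 5.0], [1.0, 2.0, 3.0, 4.0, 5.0], 5)
```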

## Appendix E. SMAPE Table

**Table A3.** Median SMAPE of multi-step forecasts from Prophet, MAPA, the two multiresolution forecasting methods MRR (using regression) and MRNN (using a neural network), seasonally adapted ARIMA (SA), automatized ARIMA (AA), multilayer perceptron (MLP), long short-term memory (LSTM) and gradient boosting machine (XGB) for different horizons and for the median over horizons 1 to 7 and 1 to 14. Forecasting techniques marked with a star (*) yielded a non-significant result in Hartigan's dip test for unimodality with Bonferroni correction; otherwise, the distribution is assumed to be non-unimodal.

**Dataset Callcenter**

| Method | h = 1 | h = 2 | h = 3 | h = 4 | h = 6 | h = 7 | h = 14 | Overall 1–7 | Overall 1–14 |
|---|---|---|---|---|---|---|---|---|---|
| AA | 36.0 | 33.1 | 34.4 | 35.5 | 32.7 | 35.4 | 36.4 | 34.6 | 36.0 |
| LSTM | 29.1 | 27.4 | 27.4 | 26.2 | 28.0 | 28.0 | 29.0 | 27.8 | 28.3 |
| MAPA | 19.8 | 21.1 | 21.4 | 21.1 | 21.8 | 21.9 | 23.3 | 21.3 | 22.7 |
| MLP | 25.3 | 24.5 | 23.8 | 23.5 | 25.2 | 24.2 | 24.9 | 24.3 | 24.4 |
| MRNN | 9.6 | 11.3 | 11.8 | 12.8 | 13.9 | 14.0 | 20.7 | 12.3 | 15.1 |
| MRR | 13.6 | 13.9 | 14.3 | 14.8 | 14.4 | 15.8 | 20.2 | 14.3 | 16.5 |
| Prophet | 14.3 | 14.5 | 14.4 | 14.3 | 14.6 | 15.0 | 15.5 | 14.4 | 14.7 |
| SA | 34.2 | 35.6 | 33.8 | 34.7 | 30.9 | 33.9 | 45.5 | 33.7 | 36.7 |
| XGB | 35.4 | 55.1 | 52.6 | 61.5 | 64.0 | 65.1 | 64.8 | 58.0 | 62.0 |
| NN | 11.4 | 16.1 | 55.5 * | 54.7 * | 63.0 | 43.7 | 73.5 | 46.3 * | 52.5 |
| MRANN | 7.7 | 10.1 | 10.5 | 11.5 | 12.1 | 13.3 | 17.3 | 11.2 | 13.3 |
| MRArima | 28.6 | 34.0 | 29.8 | 36.6 | 31.2 | 34.5 | 36.1 | 32.6 | 35.0 |

**Dataset Electricity**

| Method | h = 1 | h = 2 | h = 3 | h = 4 | h = 6 | h = 7 | h = 14 | Overall 1–7 | Overall 1–14 |
|---|---|---|---|---|---|---|---|---|---|
| AA | 4.6 | 4.6 | 4.5 | 4.3 | 4.1 | 4.7 | 7.6 * | 4.5 | 5.5 |
| LSTM | 10.1 * | 11.8 * | 11.6 * | 12.0 * | 11.5 * | 11.3 * | 11.9 * | 11.4 | 11.3 |
| MAPA | 1.3 | 1.5 | 1.5 | 1.7 | 1.8 | 2.0 | 2.4 | 1.6 | 1.9 |
| MLP | 5.5 | 5.8 | 5.4 | 5.7 | 5.9 | 5.5 | 5.2 | 5.6 | 5.6 |
| MRNN | 1.0 | 1.3 | 1.5 | 1.8 | 2.1 | 2.1 | 3.1 | 1.6 | 2.1 |
| MRR | 1.3 | 1.7 | 1.8 | 1.9 | 1.9 | 1.9 | 2.2 | 1.7 | 1.9 |
| Prophet | 1.8 | 1.8 | 1.8 | 1.8 | 1.9 | 1.9 | 1.8 | 1.8 | 1.8 |
| SA | 5.3 | 6.8 | 6.6 | 6.7 | 6.2 | 5.9 | 8.1 | 6.3 | 6.9 |
| XGB | 4.9 | 11.2 | 11.6 | 11.4 | 11.7 | 10.7 | 12.1 | 11.0 | 11.3 |
| NN | 1.0 | 1.4 | 14.6 * | 14.8 * | 15.3 * | 4.5 | 13.6 * | 4.0 | 6.6 * |
| MRANN | 0.9 | 1.3 | 1.4 | 1.6 | 1.8 | 1.9 | 2.6 | 1.5 | 1.8 |
| MRArima | 5.9 | 8.0 | 8.4 | 9.7 | 7.5 | 7.3 | 8.4 | 8.0 | 8.5 |

**Table A4.** Median SMAPE of multi-step forecasts from Prophet, MAPA, the two multiresolution forecasting methods MRR (using regression) and MRNN (using a neural network), seasonally adapted ARIMA (SA), automatized ARIMA (AA), multilayer perceptron (MLP), long short-term memory (LSTM) and gradient boosting machine (XGB) for different horizons and for the median over horizons 1 to 7 and 1 to 14. Forecasting techniques marked with a star (*) yielded a non-significant result in Hartigan's dip test for unimodality with Bonferroni correction; otherwise, the distribution is assumed to be non-unimodal. All Hartigan's dip tests in this table yielded non-significant results.

**Dataset Stocks**

| Method | h = 1 | h = 2 | h = 3 | h = 4 | h = 6 | h = 7 | h = 14 | Overall 1–7 | Overall 1–14 |
|---|---|---|---|---|---|---|---|---|---|
| AA | 1.0 | 1.6 | 2.1 | 2.6 | 3.4 | 3.6 | 5.7 | 2.3 | 3.4 |
| LSTM | 1.3 | 1.5 | 1.4 | 1.4 | 1.5 | 1.4 | 1.4 | 1.4 | 1.4 |
| MAPA | 1.0 | 1.5 | 2.1 | 2.5 | 3.1 | 3.5 | 4.7 | 2.2 | 3.0 |
| MLP | 1.3 | 1.2 | 1.3 | 1.3 | 1.4 | 1.3 | 1.3 | 1.3 | 1.3 |
| MRNN | 1.0 | 1.7 | 2.2 | 2.7 | 3.6 | 4.0 | 5.9 | 2.4 | 3.6 |
| MRR | 1.1 | 1.8 | 2.3 | 2.9 | 3.8 | 4.4 | 7.0 | 2.5 | 3.8 |
| Prophet | 3.9 | 5.0 | 5.1 | 5.3 | 5.2 | 5.2 | 6.5 | 5.0 | 5.7 |
| SA | 1.1 | 1.6 | 2.0 | 2.5 | 3.3 | 3.6 | 5.3 | 2.3 | 3.2 |
| XGB | 1.7 | 2.6 | 3.1 | 3.6 | 4.3 | 4.5 | 6.2 | 3.2 | 4.2 |
| NN | 1.1 | 1.5 | 2.0 | 2.5 | 3.3 | 3.5 | 5.1 | 2.3 | 3.2 |
| MRANN | 1.1 | 1.5 | 2.2 | 2.7 | 3.5 | 4.2 | 6.1 | 2.4 | 3.5 |
| MRArima | 1.1 | 1.7 | 2.1 | 2.7 | 3.4 | 3.8 | 5.6 | 2.3 | 3.5 |

**Dataset Prices**

| Method | h = 1 | h = 2 | h = 3 | h = 4 | h = 6 | h = 7 | h = 14 | Overall 1–7 | Overall 1–14 |
|---|---|---|---|---|---|---|---|---|---|
| AA | 4.6 | 6.2 | 6.5 | 6.8 | 7.3 | 7.5 | 10.2 | 6.4 | 7.8 |
| LSTM | 4.8 | 5.0 | 5.0 | 4.9 | 4.6 | 4.6 | 4.7 | 4.8 | 4.7 |
| MAPA | 3.8 | 4.9 | 5.6 | 5.8 | 6.4 | 7.1 | 9.3 | 5.5 | 6.8 |
| MLP | 4.3 | 4.2 | 3.9 | 4.3 | 4.1 | 4.2 | 4.3 | 4.1 | 4.2 |
| MRNN | 3.6 | 5.5 | 5.8 | 6.5 | 7.5 | 7.9 | 10.3 | 6.2 | 7.7 |
| MRR | 3.8 | 5.6 | 6.0 | 6.2 | 7.4 | 7.5 | 9.7 | 6.3 | 7.4 |
| Prophet | 19.2 | 19.6 | 19.7 | 20.3 | 21.1 | 21.2 | 23.1 | 20.3 | 21.4 |
| SA | 4.4 | 5.9 | 6.2 | 6.0 | 7.0 | 7.6 | 9.6 | 6.1 | 7.4 |
| XGB | 5.9 | 9.3 | 11.4 | 10.9 | 12.8 | 14.3 | 19.4 | 10.5 | 13.4 |
| NN | 3.9 | 6.4 | 7.1 | 6.8 | 6.6 | 6.3 | 9.1 | 6.2 | 7.4 |
| MRANN | 3.5 | 5.5 | 6.3 | 7.0 | 7.4 | 7.7 | 10.5 | 6.3 | 7.7 |
| MRArima | 4.3 | 5.9 | 6.5 | 6.3 | 7.4 | 7.4 | 9.8 | 6.4 | 7.7 |

## References

1. Box, G.E.; Hillmer, S.C.; Tiao, G.C. Analysis and modeling of seasonal time series. In Seasonal Analysis of Economic Time Series; NBER: Cambridge, MA, USA, 1978; pp. 309–344.
2. Shiskin, J.; Eisenpress, H. Seasonal adjustments by electronic computer methods. J. Am. Stat. Assoc. 1957, 52, 415–449.
3. Abeln, B.; Jacobs, J.P.A.M.; Ouwehand, P. CAMPLET: Seasonal Adjustment Without Revisions. J. Bus. Cycle Res. 2019, 15, 73–95.
4. De Gooijer, J.G.; Hyndman, R.J. 25 years of time series forecasting. Int. J. Forecast. 2006, 22, 443–473.
5. Štěpnička, M.; Cortez, P.; Donate, J.P.; Štěpničková, L. Forecasting seasonal time series with computational intelligence: On recent methods and the potential of their combinations. Expert Syst. Appl. 2013, 40, 1981–1992.
6. Taylor, S.J.; Letham, B. Forecasting at scale. Am. Stat. 2018, 72, 37–45.
7. Daubechies, I. Ten Lectures on Wavelets, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 1992.
8. Shannon, C.E. Communication in the presence of noise. Proc. IRE 1949, 37, 10–21.
9. Kourentzes, N.; Petropoulos, F.; Trapero, J.R. Improving Forecasting by Estimating Time Series Structural Components across Multiple Frequencies. Int. J. Forecast. 2014, 30, 593.
10. Kourentzes, N.; Petropoulos, F. Forecasting with Multivariate Temporal Aggregation: The case of Promotional Modelling. Int. J. Prod. Econ. 2016, 181, 298.
11. Aussem, A.; Campbell, J.; Murtagh, F. Wavelet-based Feature Extraction and Decomposition Strategies for Financial Forecasting. Int. J. Comput. Intell. Financ. 1998, 6, 5–12.
12. Gonghui, Z.; Starck, J.L.; Campbell, J.; Murtagh, F. The Wavelet Transform for Filtering Financial Data Streams. J. Comput. Intell. Financ. 1999, 7, 18–35.
13. Renaud, O.; Starck, J.L.; Murtagh, F. Prediction based on a Multiscale Decomposition. Int. J. Wavelets Multiresolution Inf. Process. 2003, 1, 217–232.
14. Renaud, O.; Starck, J.L.; Murtagh, F. Wavelet-based combined Signal Filtering and Prediction. IEEE Trans. Syst. Man Cybern. Part B 2005, 35, 1241–1251.
15. Benaouda, D.; Murtagh, F.; Starck, J.L.; Renaud, O. Wavelet-based Nonlinear Multiscale Decomposition Model for Electricity Load Forecasting. Neurocomputing 2006, 70, 139–154.
16. Murtagh, F.; Starck, J.L.; Renaud, O. On Neuro-Wavelet Modeling. Decis. Support Syst. 2004, 37, 475–484.
17. Stier, Q. MRFPY: Multiresolution Forecasting in Python. 2021. Available online: https://pypi.org/project/MRFPY/ (accessed on 9 August 2021).
18. Stier, Q. MRFR: Multiresolution Forecasting in R; R Package Version 0.4.2. 2021. Available online: https://cran.r-project.org/web/packages/mrf (accessed on 9 August 2021).
19. Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; Burovski, E.; Peterson, P.; Weckesser, W.; Bright, J.; et al. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. Nat. Methods 2020, 17, 261–272.
20. Lee, G.R.; Gommers, R.; Wasilewski, F.; Wohlfahrt, K.; O’Leary, A. PyWavelets: A Python package for wavelet analysis. J. Open Source Softw. 2019, 4, 1237.
21. Aldrich, E. wavelets: Functions for Computing Wavelet Filters, Wavelet Transforms and Multiresolution Analyses. 2020. Available online: https://CRAN.R-project.org/package=wavelets (accessed on 9 August 2021).
22. Bolos, V.J.; Benitez, R. wavScalogram: Wavelet Scalogram Tools for Time Series Analysis. 2021. Available online: https://CRAN.R-project.org/package=wavScalogram (accessed on 9 August 2021).
23. Gouhier, T.C.; Grinsted, A.; Simko, V. biwavelet: Conduct Univariate and Bivariate Wavelet Analyses. 2021. Available online: https://CRAN.R-project.org/package=biwavelet (accessed on 9 August 2021).
24. Roebuck, P.; Rice University’s DSP Group. rwt: Rice Wavelet Toolbox Wrapper. 2014. Available online: https://CRAN.R-project.org/package=rwt (accessed on 9 August 2021).
25. Whitcher, B. waveslim: Basic Wavelet Routines for One-, Two-, and Three-Dimensional Signal Processing. 2020. Available online: https://CRAN.R-project.org/package=waveslim (accessed on 9 August 2021).
26. Nason, G. wavethresh: Wavelets Statistics and Transforms. 2016. Available online: https://CRAN.R-project.org/package=wavethresh (accessed on 9 August 2021).
27. Nason, G. hwwntest: Tests of White Noise Using Wavelets. 2018. Available online: https://CRAN.R-project.org/package=hwwntest (accessed on 9 August 2021).
28. Aminghafari, M.; Poggi, J.M. Forecasting time series using wavelets. Int. J. Wavelets Multiresolut. Inf. Process. 2007, 5, 709–724.
29. Paul, R.K. WaveletArima: Wavelet ARIMA Model. 2018. Available online: https://CRAN.R-project.org/package=WaveletArima (accessed on 9 August 2021).
30. Anjoy, P.; Paul, R.K. Comparative performance of wavelet-based neural network approaches. Neural Comput. Appl. 2019, 31, 3443–3453.
31. Paul, R.K. WaveletANN: Wavelet ANN Model. 2019. Available online: https://CRAN.R-project.org/package=WaveletANN (accessed on 9 August 2021).
32. Paul, R.K. WaveletGARCH: Fit the Wavelet-GARCH Model to Volatile Time Series Data. 2020. Available online: https://CRAN.R-project.org/package=WaveletGARCH (accessed on 9 August 2021).
33. Tashman, L.J. Out-of-sample Tests of Forecasting Accuracy: An Analysis and Review. Int. J. Forecast. 2000, 16, 437–450.
34. Hyndman, R.; Athanasopoulos, G. Forecasting: Principles and Practice, 3rd ed.; OTexts: Melbourne, Australia, 2018.
35. Hyndman, R.J.; Koehler, A.B. Another look at measures of forecast accuracy. Int. J. Forecast. 2006, 22, 679–688.
36. Makridakis, S.; Andersen, A.; Carbone, R.; Fildes, R.; Hibon, M.; Lewandowski, R.; Newton, J.; Parzen, E.; Winkler, R. The Accuracy of Extrapolation (Time Series) Methods: Results of a Forecasting Competition. J. Forecast. 1982, 1, 111–153.
37. Makridakis, S.; Chatfield, C.; Hibon, M.; Lawrence, M.; Mills, T.; Ord, K.; Simmons, L.F. The M2-Competition: A Real-Time Judgmentally based Forecasting Study. Int. J. Forecast. 1993, 9, 5–22.
38. Makridakis, S.; Hibon, M. The M3-Competition: Results, Conclusions and Implications. Int. J. Forecast. 2000, 16, 451–476.
39. Taieb, S.B.; Bontempi, G.; Atiya, A.F.; Sorjamaa, A. A Review and Comparison of Strategies for Multi-Step Ahead Time Series Forecasting based on the NN5 Forecasting Competition. Expert Syst. Appl. 2012, 39, 7067–7083.
40. Demšar, J. Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 2006, 7, 1–30.
41. Athanasopoulos, G.; Hyndman, R.J.; Kourentzes, N.; Petropoulos, F. Forecasting with Temporal Hierarchies. Eur. J. Oper. Res. 2017, 262, 60–74.
42. Makridakis, S.; Spiliotis, E.; Assimakopoulos, V. The M4 Competition: 100,000 time series and 61 forecasting methods. Int. J. Forecast. 2020, 36, 54–74.
43. Weron, R. Electricity price forecasting: A review of the state-of-the-art with a look into the future. Int. J. Forecast. 2014, 30, 1030–1081.
44. Thrun, M.C.; Gehlert, T.; Ultsch, A. Analyzing the Fine Structure of Distributions. PLoS ONE 2020, 15, e0238835.
45. Thrun, M.C.; Ultsch, A. Swarm intelligence for self-organized clustering. Artif. Intell. 2021, 290, 103237.
46. Eurostat. ESS Guidelines on Seasonal Adjustment. In Eurostat Methodologies and Working Papers; Technical Report; Publications Office of the European Union: Luxembourg, 2015.
47. Quartier-la Tente, A.; Michalek, A.; Baeyens, B. RJDemetra: Interface to ’JDemetra+’ Seasonal Adjustment Software; R Package Version 0.1.6; R Team: Vienna, Austria, 2020.
48. Hyndman, R.J.; Khandakar, Y. Automatic time series forecasting: The forecast package for R. J. Stat. Softw. 2008, 27, 1–22.
49. Smith, T.G. Pmdarima. 2021. Available online: https://pypi.org/project/pmdarima/ (accessed on 9 August 2021).
50. Hyndman, R.J.; King, M.L.; Pitrun, I.; Billah, B. Local linear forecasts using cubic smoothing splines. Aust. N. Z. J. Stat. 2005, 47, 87–99.
51. Croston, J.D. Forecasting and stock control for intermittent demands. J. Oper. Res. Soc. 1972, 23, 289–303.
52. Kourentzes, N.; Petropoulos, F. MAPA: Multiple Aggregation Prediction Algorithm; R Package Version 2.0.4. 2018. Available online: https://CRAN.R-project.org/package=MAPA (accessed on 9 August 2021).
53. Timmermann, A. Forecast Combinations. Handb. Econ. Forecast. 2006, 1, 135–196.
54. Tang, Z.; De Almeida, C.; Fishwick, P.A. Time Series Forecasting using Neural Networks vs. Box–Jenkins Methodology. Simulation 1991, 57, 303–310.
55. Taylor, S.J.; Letham, B. Prophet: Forecasting at Scale. 2021. Available online: https://pypi.org/project/prophet/ (accessed on 9 August 2021).
56. Hyndman, R.J.; Billah, B. Unmasking the Theta method. Int. J. Forecast. 2003, 19, 287–290.
57. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD’16), San Francisco, CA, USA, 13–17 August 2016; ACM: New York, NY, USA, 2016; pp. 785–794.
58. Friedman, J.H. Greedy function approximation: A gradient boosting machine. Ann. Stat. 2001, 29, 1189–1232.
59. Sachs, L.; Hedderich, J. Angewandte Statistik, 16th ed.; Springer: Berlin/Heidelberg, Germany, 2002.
60. Kuster, C.; Rezgui, Y.; Mourshed, M. Electrical load forecasting models: A critical systematic review. Sustain. Cities Soc. 2017, 35, 257–270.
61. Sezer, O.B.; Gudelek, M.U.; Ozbayoglu, A.M. Financial time series forecasting with deep learning: A systematic literature review: 2005–2019. Appl. Soft Comput. 2020, 90, 106181.
62. Ibrahim, R.; Ye, H.; L’Ecuyer, P.; Shen, H. Modeling and forecasting call center arrivals: A literature survey and a case study. Int. J. Forecast. 2016, 32, 865–874.
63. Dupernex, S. Why might share prices follow a random walk? Stud. Econ. Rev. 2007, 21, 167–179.
64. European Network of Transmission System Operators for Electricity. 2019. Available online: https://www.entsoe.eu/data/power-stats/ (accessed on 20 August 2020).
65. Standard and Poors 500. 2017. Available online: https://de.finance.yahoo.com/ (accessed on 20 August 2020).
66. Nordpoolgroup. 2019. Available online: https://www.nordpoolgroup.com/historical-market-data/ (accessed on 24 August 2020).
67. Armstrong, J.S. Long-Range Forecasting: From Crystal Ball to Computer, 2nd ed.; Wiley: Hoboken, NJ, USA, 1985.
68. Ultsch, A. Pareto Density Estimation: A Density Estimation for Knowledge Discovery. In Innovations in Classification, Data Science and Information Systems; Springer: Berlin/Heidelberg, Germany, 2005; pp. 91–100.
69. Starck, J.L.; Murtagh, F. Astronomical Image and Data Analysis, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2006.
70. Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification, 2nd ed.; Wiley: Hoboken, NJ, USA, 2012.
71. Akaike, H. A New Look at the Statistical Model Identification. IEEE Trans. Autom. Control 1974, 19, 716–723.
72. Dodge, Y. The Oxford Dictionary of Statistical Terms, 1st ed.; Oxford University Press: Oxford, UK, 2006.
73. Eiben, A.E.; Smith, J.E. Introduction to Evolutionary Computing, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2003.
74. Kim, H.J.; Shin, K.S. A hybrid approach based on neural networks and genetic algorithms for detecting temporal patterns in stock markets. Appl. Soft Comput. 2007, 7, 569–576.
75. Donate, J.P.; Cortez, P. Evolutionary optimization of sparsely connected and time-lagged neural networks for time series forecasting. Appl. Soft Comput. 2014, 23, 432–443.
76. Zhang, F.; Deb, C.; Lee, S.E.; Yang, J.; Shah, K.W. Time series forecasting for building energy consumption using weighted Support Vector Regression with differential evolution optimization technique. Energy Build. 2016, 126, 94–103.
77. Ragulskis, M.; Lukoseviciute, K.; Navickas, Z.; Palivonaite, R. Short-term time series forecasting based on the identification of skeleton algebraic sequences. Neurocomputing 2011, 74, 1735–1747.
78. Palivonaite, R.; Ragulskis, M. Short-term time series algebraic forecasting with internal smoothing. Neurocomputing 2014, 127, 161–171.
79. Breiman, L. Statistical modeling: The two cultures (with comments and a rejoinder by the author). Stat. Sci. 2001, 16, 199–231.
80. Ben-David, S.; Hrubeš, P.; Moran, S.; Shpilka, A.; Yehudayoff, A. Learnability can be undecidable. Nat. Mach. Intell. 2019, 1, 44–48.
81. Zhang, G.P.; Qi, M. Neural Network Forecasting for Seasonal and Trend Time Series. Eur. J. Oper. Res. 2005, 160, 501–514.
82. Kaplan, D.T.; Glass, L. Coarse-grained embeddings of time series: Random walks, Gaussian random processes, and deterministic chaos. Phys. D Nonlinear Phenom. 1993, 64, 431–454.
83. Sugihara, G.; May, R.M. Nonlinear forecasting as a way of distinguishing chaos from measurement error in time series. Nature 1990, 344, 734–741.

**Figure 1.** Mirrored density plot of Mean Absolute Scaled Errors (MASE) computed with a rolling forecasting origin for horizon 1 on dataset Callcenter.

**Figure 2.** Mirrored density plot of Mean Absolute Scaled Errors (MASE) computed with a rolling forecasting origin for horizon 1 on dataset Electricity.

**Figure 3.** Mirrored density plot of Mean Absolute Scaled Errors (MASE) computed with a rolling forecasting origin for horizon 1 on dataset Prices.

**Figure 4.** Mirrored density plot of Mean Absolute Scaled Errors (MASE) computed with a rolling forecasting origin for horizon 1 on dataset Stocks.

**Figure 5.** Mirrored density plot of Mean Absolute Scaled Errors (MASE) computed with a rolling forecasting origin over all horizons on dataset Callcenter. The estimated distributions showing a magenta marking are approximately Gaussian distributed [44].

**Figure 6.** Mirrored density plot of Mean Absolute Scaled Errors (MASE) computed with a rolling forecasting origin over all horizons on dataset Electricity.

**Figure 7.** Mirrored density plot of Mean Absolute Scaled Errors (MASE) computed with a rolling forecasting origin over all horizons on dataset Prices.

**Figure 8.** Mirrored density plot of Mean Absolute Scaled Errors (MASE) computed with a rolling forecasting origin over all horizons on dataset Stocks.

**Table 1.** Average MASE of multi-step forecasts from Prophet, MAPA, the two multiresolution forecasting methods MRR (using regression) and MRNN (using a neural network), seasonally adapted ARIMA (SA), automatized ARIMA (AA), multilayer perceptron (MLP), long short-term memory (LSTM) and gradient boosting machine (XGB) for different horizons and for the mean over horizons 1 to 7 and 1 to 14.

**Dataset Callcenter**

| Method | h = 1 | h = 2 | h = 3 | h = 4 | h = 6 | h = 7 | h = 14 | Overall 1–7 | Overall 1–14 |
|---|---|---|---|---|---|---|---|---|---|
| AA | 2.0 | 2.0 | 2.1 | 2.2 | 2.1 | 2.1 | 1.9 | 2.1 | 2.0 |
| LSTM | 1.5 | 1.6 | 1.5 | 1.5 | 1.5 | 1.6 | 1.3 | 1.5 | 1.4 |
| MAPA | 1.4 | 1.4 | 1.4 | 1.4 | 1.3 | 1.4 | 1.3 | 1.4 | 1.4 |
| MLP | 1.4 | 1.4 | 1.4 | 1.4 | 1.4 | 1.4 | 1.1 | 1.4 | 1.3 |
| MRNN | 0.8 | 1.0 | 1.0 | 1.0 | 1.1 | 1.2 | 1.2 | 1.0 | 1.1 |
| MRR | 1.1 | 1.1 | 1.1 | 1.1 | 1.1 | 1.2 | 1.2 | 1.1 | 1.1 |
| NN | 0.8 | 1.0 | 3.3 | 3.4 | 3.7 | 2.3 | 2.9 | 2.6 | 2.6 |
| Prophet | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 1.0 | 0.9 |
| SA | 2.2 | 2.3 | 2.2 | 2.3 | 2.2 | 2.3 | 2.3 | 2.3 | 2.2 |
| MRANN | 0.7 | 0.8 | 0.8 | 0.9 | 1.0 | 1.0 | 0.9 | 0.9 | 0.9 |
| MRA | 1.7 | 2.1 | 2.1 | 2.3 | 2.1 | 2.1 | 1.9 | 2.1 | 2.0 |
| XGB | 2.3 | 3.1 | 3.1 | 3.4 | 3.6 | 3.6 | 3.0 | 3.3 | 3.2 |

**Dataset Electricity**

| Method | h = 1 | h = 2 | h = 3 | h = 4 | h = 6 | h = 7 | h = 14 | Overall 1–7 | Overall 1–14 |
|---|---|---|---|---|---|---|---|---|---|
| AA | 1.7 | 1.7 | 1.8 | 1.7 | 1.6 | 1.7 | 1.8 | 1.7 | 1.8 |
| LSTM | 2.6 | 2.9 | 3.0 | 3.1 | 2.9 | 2.9 | 2.3 | 2.9 | 2.6 |
| MAPA | 0.6 | 0.7 | 0.8 | 0.8 | 0.8 | 0.9 | 0.9 | 0.8 | 0.8 |
| MLP | 1.8 | 1.9 | 1.9 | 1.9 | 1.9 | 1.9 | 1.5 | 1.9 | 1.7 |
| MRNN | 0.4 | 0.6 | 0.6 | 0.7 | 0.8 | 0.8 | 0.9 | 0.7 | 0.7 |
| MRR | 0.6 | 0.7 | 0.8 | 0.8 | 0.8 | 0.8 | 0.7 | 0.7 | 0.7 |
| NN | 0.5 | 0.7 | 3.3 | 3.4 | 3.5 | 2.1 | 2.7 | 2.4 | 2.4 |
| Prophet | 0.8 | 0.8 | 0.7 | 0.7 | 0.8 | 0.8 | 0.6 | 0.8 | 0.7 |
| SA | 1.7 | 2.1 | 2.1 | 2.1 | 2.0 | 2.0 | 2.1 | 2.0 | 2.0 |
| MRANN | 0.4 | 0.6 | 0.6 | 0.7 | 0.7 | 0.8 | 0.8 | 0.6 | 0.7 |
| MRA | 1.7 | 2.3 | 2.6 | 2.7 | 2.4 | 2.2 | 2.0 | 2.4 | 2.3 |
| XGB | 2.2 | 3.1 | 3.2 | 3.3 | 3.4 | 3.3 | 2.8 | 3.1 | 3.0 |

**Table 2.** Average MASE of multi-step forecasts from Prophet, MAPA, the two multiresolution forecasting methods MRR (using regression) and MRNN (using a neural network), seasonally adapted ARIMA (SA), automatized ARIMA (AA), multilayer perceptron (MLP), long short-term memory (LSTM) and gradient boosting machine (XGB) for different horizons and for the mean over horizons 1 to 7 and 1 to 14.

**Dataset Stocks**

| Method | h = 1 | h = 2 | h = 3 | h = 4 | h = 6 | h = 7 | h = 14 | Overall 1–7 | Overall 1–14 |
|---|---|---|---|---|---|---|---|---|---|
| AA | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.1 | 1.0 | 1.0 |
| LSTM | 1.2 | 0.9 | 0.7 | 0.6 | 0.5 | 0.4 | 0.3 | 0.7 | 0.5 |
| MAPA | 1.0 | 1.0 | 0.9 | 1.0 | 1.0 | 0.9 | 1.0 | 1.0 | 1.0 |
| MLP | 1.2 | 0.8 | 0.7 | 0.6 | 0.5 | 0.4 | 0.3 | 0.7 | 0.5 |
| MRNN | 1.0 | 1.0 | 1.0 | 1.1 | 1.1 | 1.1 | 1.2 | 1.1 | 1.1 |
| MRR | 1.0 | 1.1 | 1.1 | 1.2 | 1.2 | 1.2 | 1.4 | 1.1 | 1.2 |
| NN | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.1 | 1.0 | 1.0 |
| Prophet | 3.3 | 2.8 | 2.4 | 2.2 | 1.7 | 1.5 | 1.5 | 2.3 | 1.9 |
| SA | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.1 | 1.0 | 1.0 |
| MRANN | 1.0 | 1.0 | 1.0 | 1.0 | 1.1 | 1.1 | 1.2 | 1.0 | 1.1 |
| MRA | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.1 | 1.0 | 1.0 |
| XGB | 1.5 | 1.5 | 1.5 | 1.5 | 1.4 | 1.4 | 1.3 | 1.5 | 1.4 |

**Dataset Prices**

| Method | h = 1 | h = 2 | h = 3 | h = 4 | h = 6 | h = 7 | h = 14 | Overall 1–7 | Overall 1–14 |
|---|---|---|---|---|---|---|---|---|---|
| AA | 1.2 | 1.1 | 1.1 | 1.0 | 1.2 | 1.3 | 1.3 | 1.1 | 1.2 |
| LSTM | 1.3 | 0.9 | 0.8 | 0.8 | 0.8 | 0.8 | 0.7 | 0.9 | 0.8 |
| MAPA | 1.1 | 0.9 | 0.9 | 0.9 | 1.1 | 1.2 | 1.2 | 1.0 | 1.1 |
| MLP | 1.3 | 0.8 | 0.7 | 0.7 | 0.7 | 0.8 | 0.6 | 0.8 | 0.7 |
| MRNN | 1.1 | 1.0 | 1.0 | 1.0 | 1.2 | 1.4 | 1.3 | 1.1 | 1.2 |
| MRR | 1.1 | 1.0 | 1.0 | 1.0 | 1.2 | 1.3 | 1.3 | 1.1 | 1.2 |
| NN | 1.2 | 1.2 | 1.1 | 1.1 | 1.2 | 1.2 | 1.2 | 1.2 | 1.2 |
| Prophet | 3.9 | 2.7 | 2.5 | 2.4 | 2.5 | 2.8 | 2.3 | 2.7 | 2.5 |
| SA | 1.3 | 1.1 | 1.0 | 1.0 | 1.1 | 1.3 | 1.2 | 1.1 | 1.1 |
| MRANN | 1.1 | 1.0 | 1.0 | 1.0 | 1.2 | 1.3 | 1.3 | 1.1 | 1.2 |
| MRA | 1.2 | 1.1 | 1.1 | 1.0 | 1.2 | 1.3 | 1.3 | 1.1 | 1.2 |
| XGB | 1.8 | 1.6 | 1.7 | 1.6 | 2.0 | 2.3 | 2.3 | 1.8 | 2.0 |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Stier, Q.; Gehlert, T.; Thrun, M.C.
Multiresolution Forecasting for Industrial Applications. *Processes* **2021**, *9*, 1697.
https://doi.org/10.3390/pr9101697
