Forecasting Robust Gaussian Process State Space Models for Assessing Intervention Impact in Internet of Things Time Series
Abstract
1. Introduction
2. Review of Variants of Gaussian Process State Space Models
2.1. Review of Gaussian Process State Space Models
2.2. Review of Probabilistic Recurrent State Space Models
3. Robust Gaussian State Space Model
- Step 1. We initialize the variational parameters.
- Step 2. Using the current variational parameters, we generate the n-dimensional latent state trajectories by sampling from the corresponding multivariate Gaussian variational distribution, and we draw the remaining latent variables from their variational distributions.
- Step 3. We evaluate the ELBO in (20) at the current values of the variational parameters and the samples generated in Step 2.
- Step 4. We update the variational parameters via a backward pass of the gradient descent algorithm using the *autodiff routine* in PyTorch.
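The four steps above can be sketched as a stochastic-gradient ELBO optimization loop in PyTorch. This is a minimal illustration, not the paper's implementation: the variational family (a diagonal Gaussian over an n-dimensional state), the stand-in `elbo()` function, and all hyperparameters are assumptions chosen for brevity; the paper's ELBO in (20) is model-specific.

```python
import torch

torch.manual_seed(0)

# Step 1: initialize variational parameters (illustrative: a mean vector and a
# log-standard-deviation vector for an n-dimensional Gaussian latent state).
n = 4
q_mu = torch.zeros(n, requires_grad=True)
q_log_sigma = torch.zeros(n, requires_grad=True)

optimizer = torch.optim.Adam([q_mu, q_log_sigma], lr=1e-2)

def elbo(x_sample):
    """Toy stand-in for the paper's ELBO: log joint minus log q, with a
    standard-normal surrogate for the model's log joint density."""
    q = torch.distributions.Normal(q_mu, q_log_sigma.exp())
    p = torch.distributions.Normal(torch.zeros(n), torch.ones(n))
    return (p.log_prob(x_sample) - q.log_prob(x_sample)).sum()

for step in range(200):
    optimizer.zero_grad()
    # Step 2: sample a latent state via the reparameterization trick, so that
    # gradients flow back to (q_mu, q_log_sigma).
    eps = torch.randn(n)
    x = q_mu + q_log_sigma.exp() * eps
    # Step 3: evaluate the (negative) ELBO at the current sample.
    loss = -elbo(x)
    # Step 4: backward pass via PyTorch autodiff, then a gradient step.
    loss.backward()
    optimizer.step()
```

Because the surrogate prior and variational family coincide at initialization, the parameters simply hover near the optimum here; with a real state space model, the same loop structure applies with the model's ELBO in place of the toy `elbo()`.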
4. Application: Intervention Impact Analysis for IoT Temperature Time Series
5. Summary and Discussion
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Review of Gaussian Process (GP) Regression Models for Time Series
- K1. Periodic kernels that capture seasonal behavior in the time series are defined as follows:
- K2. The radial basis kernel, which models short-term to medium-term variations in the time series as a function of the time indices, is defined as follows:
- K3. The spectral mixture kernels [14] can extrapolate non-linear patterns in a time series and are defined as follows:
- K4. We can also use the additive kernel composition as follows [11]:
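The kernels K1-K4 can be sketched for scalar time indices as below. This is a hedged illustration with NumPy: the hyperparameter names (length-scale `ell`, period `p`, mixture weights `w`, means `mu`, variances `v`) are placeholder choices, not necessarily the paper's notation, and unit signal variances are assumed.

```python
import numpy as np

def k_periodic(t, s, ell=1.0, p=1.0):
    """K1: periodic kernel capturing seasonal behavior."""
    return np.exp(-2.0 * np.sin(np.pi * np.abs(t - s) / p) ** 2 / ell ** 2)

def k_rbf(t, s, ell=1.0):
    """K2: radial basis kernel for short- to medium-term variation."""
    return np.exp(-((t - s) ** 2) / (2.0 * ell ** 2))

def k_spectral_mixture(t, s, w, mu, v):
    """K3: spectral mixture kernel (Wilson & Adams, 2013)."""
    tau = t - s
    return float(np.sum(w * np.exp(-2.0 * np.pi ** 2 * tau ** 2 * v)
                        * np.cos(2.0 * np.pi * tau * mu)))

def k_additive(t, s):
    """K4: additive composition of simpler kernels (here K1 + K2)."""
    return k_periodic(t, s) + k_rbf(t, s)

# Each kernel attains its maximum at t == s.
print(k_periodic(0.0, 0.0), k_rbf(0.0, 0.0), k_additive(0.0, 0.0))  # → 1.0 1.0 2.0
```

In practice one would use a GP library's kernel classes (e.g., GPyTorch's `PeriodicKernel`, `RBFKernel`, and `SpectralMixtureKernel`) rather than hand-rolled functions, but the closed forms above are what those classes evaluate.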
Appendix B. Review of Variational Approximations
Appendix B.1. Variational Inference and the Evidence Lower Bound (ELBO)
Appendix B.2. Stochastic Variational Inference
Algorithm 1: Generic SVI Algorithm [43]
Appendix B.3. Black Box Variational Inference
Algorithm 2: Generic BBVI Algorithm [45]
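A minimal score-function (REINFORCE-style) gradient estimator in the spirit of Algorithm 2 can be sketched for a one-dimensional example. Here the variational family is q(z) = N(mu, 1) and the target is p(z) = N(2, 1); the step size, sample count, and function names are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(z):          # unnormalized target log-density, N(2, 1)
    return -0.5 * (z - 2.0) ** 2

def log_q(z, mu):      # variational log-density, N(mu, 1), up to a constant
    return -0.5 * (z - mu) ** 2

def score_mu(z, mu):   # d/dmu log q(z | mu), the score function
    return z - mu

mu, lr, S = 0.0, 0.05, 64
for it in range(500):
    z = rng.normal(mu, 1.0, size=S)   # sample from q (black box: no gradients of p needed)
    # BBVI estimator: average of score * (log p - log q) over the S samples.
    grad = np.mean(score_mu(z, mu) * (log_p(z) - log_q(z, mu)))
    mu += lr * grad                   # ascend the noisy ELBO gradient

print(round(mu, 1))  # converges near the target mean 2.0
```

The key property shown here is that the estimator only evaluates log-densities at sampled points, so it applies to models whose likelihood is not reparameterizable; in practice variance-reduction devices (Rao-Blackwellization, control variates) are added, as in [45].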
References
- Malki, A.; Atlam, E.S.; Gad, I. Machine learning approach of detecting anomalies and forecasting time-series of IoT devices. Alex. Eng. J. 2022, 61, 8973–8986. [Google Scholar] [CrossRef]
- Zhang, D.; Lindholm, G.; Ratnaweera, H. Use long short-term memory to enhance Internet of Things for combined sewer overflow monitoring. J. Hydrol. 2018, 556, 409–418. [Google Scholar] [CrossRef]
- Toman, P.; Soliman, A.; Ravishanker, N.; Rajasekaran, S.; Lally, N.; D’Addeo, H. Understanding insured behavior through causal analysis of IoT streams. In Proceedings of the 2023 6th International Conference on Data Mining and Knowledge Discovery (DMKD 2023), Seoul, Republic of Korea, 17–19 March 2023. [Google Scholar]
- Soliman, A.; Rajasekaran, S.; Toman, P.; Ravishanker, N.; Lally, N.G.; D’Addeo, H. A custom unsupervised approach for pipe-freeze online anomaly detection. In Proceedings of the 2021 IEEE 7th World Forum on Internet of Things (WF-IoT), New Orleans, LA, USA, 14 June–31 July 2021; pp. 663–668. [Google Scholar]
- Box, G.E.; Tiao, G.C. Intervention analysis with applications to economic and environmental problems. J. Am. Stat. Assoc. 1975, 70, 70–79. [Google Scholar] [CrossRef]
- Abraham, B. Intervention analysis and multiple time series. Biometrika 1980, 67, 73–78. [Google Scholar] [CrossRef]
- Aminikhanghahi, S.; Cook, D.J. A survey of methods for time series change point detection. Knowl. Inf. Syst. 2017, 51, 339–367. [Google Scholar] [CrossRef]
- Brodersen, K.H.; Gallusser, F.; Koehler, J.; Remy, N.; Scott, S.L. Inferring causal impact using Bayesian structural time-series models. Ann. Appl. Stat. 2015, 9, 247–274. [Google Scholar] [CrossRef]
- Scott, S.L.; Varian, H.R. Predicting the present with Bayesian structural time series. Int. J. Math. Model. Numer. Optim. 2014, 5, 4–23. [Google Scholar] [CrossRef]
- Williams, C.K.; Rasmussen, C.E. Gaussian Processes for Machine Learning; MIT Press: Cambridge, MA, USA, 2006; Volume 2. [Google Scholar]
- Corani, G.; Benavoli, A.; Zaffalon, M. Time series forecasting with Gaussian processes needs priors. In Joint European Conference on Machine Learning and Knowledge Discovery in Databases; Springer: Cham, Switzerland, 2021; pp. 103–117. [Google Scholar]
- Roberts, S.; Osborne, M.; Ebden, M.; Reece, S.; Gibson, N.; Aigrain, S. Gaussian processes for time-series modelling. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2013, 371, 20110550. [Google Scholar] [CrossRef] [PubMed]
- Duvenaud, D.; Lloyd, J.; Grosse, R.; Tenenbaum, J.; Zoubin, G. Structure discovery in nonparametric regression through compositional kernel search. In Proceedings of the International Conference on Machine Learning, PMLR, Atlanta, GA, USA, 17–19 June 2013; pp. 1166–1174. [Google Scholar]
- Wilson, A.; Adams, R. Gaussian process kernels for pattern discovery and extrapolation. In Proceedings of the International Conference on Machine Learning, PMLR, Atlanta, GA, USA, 17–19 June 2013; pp. 1067–1075. [Google Scholar]
- Brahim-Belhouari, S.; Bermak, A. Gaussian process for nonstationary time series prediction. Comput. Stat. Data Anal. 2004, 47, 705–712. [Google Scholar] [CrossRef]
- Durbin, J.; Koopman, S.J. Time Series Analysis by State Space Methods; OUP Oxford: Oxford, UK, 2012; Volume 38. [Google Scholar]
- Shumway, R.H.; Stoffer, D.S. Time Series Analysis and Its Applications, 3rd ed.; Springer: Berlin/Heidelberg, Germany, 2011. [Google Scholar]
- West, M.; Harrison, J. Bayesian Forecasting and Dynamic Models; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
- Frigola, R.; Lindsten, F.; Schön, T.B.; Rasmussen, C.E. Bayesian inference and learning in Gaussian process state-space models with particle MCMC. Adv. Neural Inf. Process. Syst. 2013, 26. Available online: https://arxiv.org/abs/1306.2861 (accessed on 23 May 2025).
- Doerr, A.; Daniel, C.; Schiegg, M.; Duy, N.T.; Schaal, S.; Toussaint, M.; Sebastian, T. Probabilistic recurrent state-space models. In Proceedings of the International Conference on Machine Learning, PMLR, Stockholm, Sweden, 10–15 July 2018; pp. 1280–1289. [Google Scholar]
- Deisenroth, M.P.; Huber, M.F.; Hanebeck, U.D. Analytic moment-based Gaussian process filtering. In Proceedings of the 26th Annual International Conference on Machine Learning, Montreal, QC, Canada, 14–18 June 2009; pp. 225–232. [Google Scholar]
- Turner, R.; Deisenroth, M.; Rasmussen, C. State-space inference and learning with Gaussian processes. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, JMLR Workshop and Conference Proceedings, Sardinia, Italy, 13–15 May 2010; pp. 868–875. [Google Scholar]
- Frigola, R.; Chen, Y.; Rasmussen, C.E. Variational Gaussian process state-space models. Adv. Neural Inf. Process. Syst. 2014, 27. Available online: https://arxiv.org/abs/1406.4905 (accessed on 23 May 2025).
- Blei, D.M.; Kucukelbir, A.; McAuliffe, J.D. Variational inference: A review for statisticians. J. Am. Stat. Assoc. 2017, 112, 859–877. [Google Scholar] [CrossRef]
- Eleftheriadis, S.; Nicholson, T.; Deisenroth, M.; Hensman, J. Identification of Gaussian process state space models. Adv. Neural Inf. Process. Syst. 2017, 30. Available online: https://arxiv.org/abs/1705.10888 (accessed on 23 May 2025).
- Ialongo, A.D.; Van Der Wilk, M.; Hensman, J.; Rasmussen, C.E. Overcoming mean-field approximations in recurrent Gaussian process models. In Proceedings of the International Conference on Machine Learning, PMLR, Long Beach, CA, USA, 9–15 June 2019; pp. 2931–2940. [Google Scholar]
- Toman, P.; Ravishanker, N.; Lally, N.; Rajasekaran, S. Latent autoregressive Student-t prior process models to assess impact of interventions in time series. Future Internet 2023, 16, 8. [Google Scholar] [CrossRef]
- West, M. On scale mixtures of normal distributions. Biometrika 1987, 74, 646–648. [Google Scholar] [CrossRef]
- Ravishanker, N.; Chi, Z.; Dey, D.K. A First Course in Linear Model Theory; Chapman and Hall/CRC: Boca Raton, FL, USA, 2021. [Google Scholar]
- Schuster, M.; Paliwal, K.K. Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 1997, 45, 2673–2681. [Google Scholar] [CrossRef]
- Paszke, A.; Gross, S.; Massa, F.; Lerer, A.; Bradbury, J.; Chanan, G.; Killeen, T.; Lin, Z.; Gimelshein, N.; Antiga, L.; et al. Pytorch: An imperative style, high-performance deep learning library. Adv. Neural Inf. Process. Syst. 2019, 32. Available online: https://arxiv.org/abs/1912.01703 (accessed on 23 May 2025).
- Gardner, J.; Pleiss, G.; Weinberger, K.Q.; Bindel, D.; Wilson, A.G. GPyTorch: Blackbox matrix-matrix Gaussian process inference with GPU acceleration. Adv. Neural Inf. Process. Syst. 2018, 31. Available online: https://arxiv.org/abs/1809.11165 (accessed on 23 May 2025).
- Lin, Z.; Cheng, L.; Yin, F.; Xu, L.; Cui, S. Output-Dependent Gaussian Process State-Space Model. In Proceedings of the ICASSP 2023—2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Rhodos, Greece, 4–10 June 2023; pp. 1–5. [Google Scholar]
- Gneiting, T.; Raftery, A.E. Strictly proper scoring rules, prediction, and estimation. J. Am. Stat. Assoc. 2007, 102, 359–378. [Google Scholar] [CrossRef]
- Lloyd, J.; Duvenaud, D.; Grosse, R.; Tenenbaum, J.; Ghahramani, Z. Automatic construction and natural-language description of nonparametric regression models. In Proceedings of the AAAI Conference on Artificial Intelligence, Québec City, QC, Canada, 27–31 July 2014; Volume 28. [Google Scholar]
- Snelson, E.; Ghahramani, Z. Sparse Gaussian processes using pseudo-inputs. Adv. Neural Inf. Process. Syst. 2005, 18. Available online: https://papers.nips.cc/paper_files/paper/2005/hash/4491777b1aa8b5b32c2e8666dbe1a495-Abstract.html (accessed on 23 May 2025).
- Titsias, M. Variational learning of inducing variables in sparse Gaussian processes. In Proceedings of the Artificial Intelligence and Statistics, PMLR, Clearwater Beach, FL, USA, 16–18 April 2009; pp. 567–574. [Google Scholar]
- Hensman, J.; Fusi, N.; Lawrence, N.D. Gaussian processes for big data. arXiv 2013, arXiv:1309.6835. [Google Scholar]
- Hensman, J.; Matthews, A.; Ghahramani, Z. Scalable variational Gaussian process classification. In Proceedings of the Artificial Intelligence and Statistics, PMLR, San Diego, CA, USA, 9–12 May 2015; pp. 351–360. [Google Scholar]
- Girard, A.; Rasmussen, C.E.; Quinonero-Candela, J.; Murray-Smith, R.; Winther, O.; Larsen, J. Multiple-step ahead prediction for non-linear dynamic systems—A Gaussian process treatment with propagation of the uncertainty. Adv. Neural Inf. Process. Syst. 2002, 15, 529–536. [Google Scholar]
- Mattos, C.L.C.; Damianou, A.; Barreto, G.A.; Lawrence, N.D. Latent autoregressive Gaussian processes models for robust system identification. IFAC-PapersOnLine 2016, 49, 1121–1126. [Google Scholar] [CrossRef]
- Lee, H.; Yun, E.; Yang, H.; Lee, J. Scale mixtures of neural network Gaussian processes. arXiv 2021, arXiv:2107.01408. [Google Scholar]
- Hoffman, M.D.; Blei, D.M.; Wang, C.; Paisley, J. Stochastic variational inference. J. Mach. Learn. Res. 2013, 14, 1303–1347. [Google Scholar]
- Bishop, C.M.; Nasrabadi, N.M. Pattern Recognition and Machine Learning; Springer: Berlin/Heidelberg, Germany, 2006; Volume 4. [Google Scholar]
- Ranganath, R.; Gerrish, S.; Blei, D. Black box variational inference. In Proceedings of the Artificial Intelligence and Statistics, PMLR, Valencia, Spain, 22–25 April 2014; pp. 814–822. [Google Scholar]
Model | RMSE | sMAPE | CRPS | PICP
---|---|---|---|---
PRSSM | 1.939 | 0.030 | 1.476 | 0.6784
R-PRSSM | 3.209 | 0.049 | 2.454 | 0.9722
Model | RMSE | sMAPE | CRPS | PICP
---|---|---|---|---
PRSSM | 3.818 | 0.063 | 2.789 | 0.689
R-PRSSM | 6.688 | 0.108 | 5.178 | 0.981
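The four accuracy measures reported in the tables can be sketched as follows. Definitions here follow common conventions (e.g., the sample-based CRPS estimator consistent with Gneiting and Raftery, 2007); they may differ in small details, such as scaling, from the paper's exact formulas, and the simulated data are purely illustrative.

```python
import numpy as np

def rmse(y, yhat):
    """Root mean squared error of point forecasts."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def smape(y, yhat):
    """Symmetric mean absolute percentage error."""
    return float(np.mean(2.0 * np.abs(y - yhat) / (np.abs(y) + np.abs(yhat))))

def crps_samples(y, samples):
    """Monte Carlo CRPS: mean over t of E|X - y_t| - 0.5 E|X - X'|,
    with X, X' drawn from the predictive distribution."""
    term1 = np.mean(np.abs(samples - y[None, :]), axis=0)
    term2 = 0.5 * np.mean(np.abs(samples[:, None, :] - samples[None, :, :]),
                          axis=(0, 1))
    return float(np.mean(term1 - term2))

def picp(y, lower, upper):
    """Prediction-interval coverage probability: fraction of observations
    falling inside the interval."""
    return float(np.mean((y >= lower) & (y <= upper)))

# Toy usage with simulated temperatures and predictive draws.
rng = np.random.default_rng(1)
y = rng.normal(20.0, 2.0, size=50)                     # "observed" series
samples = y[None, :] + rng.normal(0, 1.0, (200, 50))   # predictive samples
yhat = samples.mean(axis=0)
lo, hi = np.quantile(samples, [0.025, 0.975], axis=0)
print(rmse(y, yhat), smape(y, yhat), picp(y, lo, hi))
```

Read together, a lower RMSE/sMAPE/CRPS indicates sharper point and probabilistic forecasts, while a PICP close to the nominal level (here 95%) indicates well-calibrated intervals; this is the trade-off visible between PRSSM and R-PRSSM in the tables.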
(a) PRSSM

Predicted Labels | Human: No Action | Human: Action
---|---|---
No Action | 32 | 4
Action | 8 | 6

(b) Robust PRSSM

Predicted Labels | Human: No Action | Human: Action
---|---|---
No Action | 34 | 8
Action | 6 | 2
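As a small arithmetic check, the overall agreement with the human labels can be computed directly from the two confusion matrices above, assuming the reading that rows are predicted labels and columns are human labels.

```python
import numpy as np

# Rows: predicted (No Action, Action); columns: human (No Action, Action).
cm_prssm = np.array([[32, 4],
                     [8, 6]])
cm_robust = np.array([[34, 8],
                      [6, 2]])

def accuracy(cm):
    """Fraction of cases where predicted and human labels agree."""
    return float(np.trace(cm) / cm.sum())

print(accuracy(cm_prssm), accuracy(cm_robust))  # → 0.76 and 0.72
```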
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Toman, P.; Ravishanker, N.; Lally, N.; Rajasekaran, S. Forecasting Robust Gaussian Process State Space Models for Assessing Intervention Impact in Internet of Things Time Series. Forecasting 2025, 7, 22. https://doi.org/10.3390/forecast7020022