Article

Workload Prediction for Proactive Resource Allocation in Large-Scale Cloud-Edge Applications

1 Product Development, Tieto Sweden Support Services AB, 907 36 Umeå, Sweden
2 Department of Computing Science, Umeå University, 901 87 Umeå, Sweden
* Authors to whom correspondence should be addressed.
Electronics 2025, 14(16), 3333; https://doi.org/10.3390/electronics14163333
Submission received: 1 July 2025 / Revised: 18 August 2025 / Accepted: 19 August 2025 / Published: 21 August 2025
(This article belongs to the Special Issue Next-Generation Cloud–Edge Computing: Systems and Applications)

Abstract

Accurate workload prediction is essential for proactive resource allocation in large-scale Content Delivery Networks (CDNs), where traffic patterns are highly dynamic and geographically distributed. This paper introduces a CDN-tailored prediction and autoscaling framework that integrates statistical and deep learning models within an adaptive feedback loop. The framework is evaluated using 18 months of real traffic traces from a production multi-tier CDN, capturing realistic workload seasonality, cache–tier interactions, and propagation delays. Unlike generic cloud-edge predictors, our design incorporates CDN-specific features and model-switching mechanisms to balance prediction accuracy with computational cost. Seasonal ARIMA (S-ARIMA), Long Short-Term Memory (LSTM), Bidirectional LSTM (Bi-LSTM), and Online Sequential Extreme Learning Machine (OS-ELM) are combined to support both short-horizon scaling and longer-term capacity planning. The predictions drive a queue-based resource-estimation model, enabling proactive cache–server scaling with low rejection rates. Experimental results demonstrate that the framework maintains high accuracy while reducing computational overhead through adaptive model selection. The proposed approach offers a practical, production-tested solution for predictive autoscaling in CDNs and can be extended to other latency-sensitive edge-cloud services with hierarchical architectures.
Keywords: workload modeling; workload prediction; resource allocation; prediction framework; ARIMA; LSTM; OS-ELM; CDN

Share and Cite

MDPI and ACS Style

Duc, T.L.; Nguyen, C.; Östberg, P.-O. Workload Prediction for Proactive Resource Allocation in Large-Scale Cloud-Edge Applications. Electronics 2025, 14, 3333. https://doi.org/10.3390/electronics14163333


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.

