
ADAPT: Attention-Driven Domain Adaptation for Inter-cluster Workload Forecasting in Cloud Data Centers

  • Conference paper
  • Published in: CLOUD Computing – CLOUD 2024 (CLOUD 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15423)

Abstract

Cloud computing has recently gained popularity due to its cost-effective and high-quality services, and cloud-native systems are expected to host more than 95% of digital workloads. Cloud service providers face two significant challenges: real-time workload prediction and effective resource management. Moreover, allocating resources over time may result in a suboptimal execution environment, because workloads rise and fall considerably following time-dependent patterns. Recent breakthroughs in deep learning have gained widespread favor for predicting highly nonlinear cloud workloads; nevertheless, they fail to generalize to inter-cluster workload forecasting because of inadequate workload data at the start of each cluster. Furthermore, the distribution disparity across distinct cluster workloads, caused by a variety of factors, makes it difficult to reuse existing data or models directly. To overcome these challenges, we propose ADAPT, which relies on Attention-Driven Domain Adaptation. First, we use an LSTM architecture as the backbone of our model. Second, we construct a strategically shared attention module that transfers relevant knowledge from the source domain to the target domain by inducing domain-invariant latent features while retaining domain-specific features. Lastly, adversarial training is used to increase the model’s resilience and predictive accuracy. Comprehensive experimental evaluations indicate that our proposed approach significantly outperforms existing baselines.
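The abstract outlines a three-part pipeline: per-domain LSTM encoders, an attention module shared between source and target domains, and an adversarial objective that pushes the pooled features toward domain invariance. The paper's actual implementation is not shown on this page; the NumPy sketch below is a rough, hypothetical illustration of the shared-attention idea only — all function names, dimensions, and parameters are assumptions, and the adversarial step is indicated only by a comment.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, T = 4, 8, 12  # hypothetical input dim, hidden size, sequence length

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gate pre-activations stacked as [i, f, o, g]."""
    z = W @ x + U @ h + b
    i, f, o = (1.0 / (1.0 + np.exp(-z[k * H:(k + 1) * H])) for k in range(3))
    g = np.tanh(z[3 * H:])
    c = f * c + i * g
    return o * np.tanh(c), c

def encode(seq, params):
    """Run a domain-specific LSTM over a (T, D) sequence; return (T, H) states."""
    W, U, b = params
    h, c = np.zeros(H), np.zeros(H)
    states = []
    for x in seq:
        h, c = lstm_step(x, h, c, W, U, b)
        states.append(h)
    return np.stack(states)

def shared_attention(states, w):
    """Attention pooling with a weight vector w SHARED across domains,
    so both domains are projected into a common latent space."""
    scores = states @ w
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ states, a  # pooled latent feature and attention weights

def make_params():
    return (rng.normal(0, 0.1, (4 * H, D)),
            rng.normal(0, 0.1, (4 * H, H)),
            np.zeros(4 * H))

# Domain-specific encoders, one shared attention vector.
src_params, tgt_params = make_params(), make_params()
w_shared = rng.normal(0, 0.1, H)

src_seq = rng.normal(size=(T, D))  # e.g. workload history of a mature cluster
tgt_seq = rng.normal(size=(T, D))  # short history of a newly started cluster

src_feat, src_attn = shared_attention(encode(src_seq, src_params), w_shared)
tgt_feat, tgt_attn = shared_attention(encode(tgt_seq, tgt_params), w_shared)

# In adversarial training, a domain discriminator would score these pooled
# features and its gradient would be reversed into the encoders, shrinking
# the gap between source and target feature distributions.
domain_gap = np.linalg.norm(src_feat - tgt_feat)
print(src_feat.shape, float(src_attn.sum()), domain_gap > 0)
```

In this sketch, only `w_shared` couples the two domains; everything else (encoders, data) stays domain-specific, which mirrors the abstract's split between domain-invariant and domain-specific features, though the paper's exact architecture may differ.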



Acknowledgments

This research was partly supported by (1) the Korea Institute of Science and Technology Information (KISTI) and (2) an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. RS-2023-00220631, Edge Cloud Reference Architecture Standardization for Low Latency and Lightweight Cloud Service).

Author information


Correspondence to Eui-Nam Huh.


Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Mahbub, N.I., Sinthia, A.K., Jeon, M., Park, J., Huh, EN. (2025). ADAPT: Attention-Driven Domain Adaptation for Inter-cluster Workload Forecasting in Cloud Data Centers. In: Wang, Y., Zhang, LJ. (eds) CLOUD Computing – CLOUD 2024. CLOUD 2024. Lecture Notes in Computer Science, vol 15423. Springer, Cham. https://doi.org/10.1007/978-3-031-77153-8_6


  • DOI: https://doi.org/10.1007/978-3-031-77153-8_6


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-77152-1

  • Online ISBN: 978-3-031-77153-8

  • eBook Packages: Computer Science (R0)
