
Multi-step-ahead host load prediction using autoencoder and echo state networks in cloud computing

The Journal of Supercomputing

Abstract

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Many resource management approaches have been proposed for cloud infrastructures, but effective resource management remains a major challenge for the leading cloud operators (e.g., Amazon, Microsoft, Google), because the underlying workloads and real-world operational demands are highly complex. Among these approaches, accurate host load prediction is one of the most effective ways to address this challenge. In this paper, we propose a new method for host load prediction that uses an autoencoder as the pre-recurrent feature layer of an echo state network. The aim of the proposed method is to predict the host load over future intervals, using the Google cluster-usage dataset. Experiments performed on the Google load traces show that the proposed method achieves higher accuracy than state-of-the-art methods.
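To make the architecture concrete, the sketch below shows one way the pipeline described in the abstract could be wired together: an autoencoder compresses each window of recent host-load samples into a low-dimensional code, and that code drives a leaky echo state network (ESN) whose linear readout is fit by ridge regression to predict the load several steps ahead. This is only a minimal illustration, not the authors' implementation: the synthetic trace, the window length, the code size, the sigmoid autoencoder trained by full-batch gradient descent, and all reservoir hyperparameters are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic host-load trace standing in for the Google cluster trace (assumption).
t = np.arange(3000)
load = 0.5 + 0.3 * np.sin(2 * np.pi * t / 60) + 0.05 * rng.standard_normal(t.size)

window, horizon = 20, 5                     # history length and steps ahead (illustrative)
n_samples = len(load) - window - horizon
X = np.stack([load[i:i + window] for i in range(n_samples)])              # input windows
y = np.array([load[i + window + horizon - 1] for i in range(n_samples)])  # load `horizon` steps ahead

# --- Autoencoder: compress each 20-sample window into an 8-dimensional code ---
n_code, lr, epochs = 8, 0.05, 300
W_enc = 0.1 * rng.standard_normal((window, n_code))
W_dec = 0.1 * rng.standard_normal((n_code, window))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(epochs):                     # full-batch gradient descent on reconstruction error
    H = sigmoid(X @ W_enc)                  # hidden codes
    err = H @ W_dec - X                     # linear reconstruction error
    grad_dec = (H.T @ err) / n_samples
    grad_enc = (X.T @ ((err @ W_dec.T) * H * (1 - H))) / n_samples
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

codes = sigmoid(X @ W_enc)                  # autoencoder features passed on to the ESN

# --- Echo state network: fixed random reservoir + ridge-regression readout ---
n_res, leak, ridge = 200, 0.3, 1e-6
W_in = 0.5 * rng.uniform(-1, 1, (n_res, n_code))
W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))   # rescale to spectral radius 0.9

states = np.zeros((n_samples, n_res))
x = np.zeros(n_res)
for i in range(n_samples):                  # drive the reservoir with the autoencoder codes
    x = (1 - leak) * x + leak * np.tanh(W_in @ codes[i] + W_res @ x)
    states[i] = x

split = int(0.8 * n_samples)                # chronological train/test split
W_out = np.linalg.solve(states[:split].T @ states[:split] + ridge * np.eye(n_res),
                        states[:split].T @ y[:split])

pred = states[split:] @ W_out
print("test MSE:", np.mean((pred - y[split:]) ** 2))
```

Note that only the readout weights W_out are trained in the ESN part; the input and reservoir weights stay fixed after random initialization, which is what keeps echo state networks far cheaper to train than recurrent networks fitted with backpropagation through time.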





Acknowledgments

This work was partially supported by Grants BE2011169 and BK2011563 from the Natural Science Foundation of Jiangsu Province, and by Grants 61100111, 61300157, 61201425, and 61271231 from the Natural Science Foundation of China.

Author information


Corresponding author

Correspondence to Qiangpeng Yang.


About this article


Cite this article

Yang, Q., Zhou, Y., Yu, Y. et al. Multi-step-ahead host load prediction using autoencoder and echo state networks in cloud computing. J Supercomput 71, 3037–3053 (2015). https://doi.org/10.1007/s11227-015-1426-8

