Budget-Aware Scheduling for Hyperparameter Optimization Process in Cloud Environment

  • Conference paper
  • First Online:
Algorithms and Architectures for Parallel Processing (ICA3PP 2021)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 13157)

Abstract

Hyperparameter optimization, a necessary step for the majority of machine learning models, is crucial to achieving optimal model performance. Unfortunately, the optimization process is usually computation-intensive and time-consuming due to the large search space. With the popularity and maturity of cloud computing, many researchers now leverage public cloud services (e.g., Amazon AWS) to train machine learning models. Time and monetary cost, two conflicting objectives, are the primary concerns of cloud machine learning users. In this paper, we propose HyperWorkflow, a workflow engine service for executing hyperparameter optimization that coordinates hyperparameter optimization jobs and cloud service instances. HyperWorkflow orchestrates the hyperparameter optimization process in a parallel and cost-effective manner on heterogeneous cloud resources, and schedules hyperparameter trials with a bin-packing approach that makes the best use of cloud resources to speed up tuning under a budget constraint. The evaluations show that HyperWorkflow can speed up hyperparameter optimization across a range of budgets.
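
To make the scheduling idea concrete, below is a minimal, illustrative sketch (not the authors' implementation) of a first-fit-decreasing bin-packing heuristic: trials, characterized by estimated GPU-hours, are packed onto rented cloud instances, and a new instance is rented from a price catalog only while the cumulative rental cost stays within the budget. The names (Trial, Instance, schedule) and all capacities, prices, and demands are assumptions made purely for illustration.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Trial:
    trial_id: int
    est_gpu_hours: float          # estimated compute demand of one hyperparameter trial

@dataclass
class Instance:
    name: str
    capacity_gpu_hours: float     # compute available within the scheduling window
    hourly_price: float           # rental price per hour (illustrative values only)
    assigned: List[Trial] = field(default_factory=list)

    @property
    def load(self) -> float:
        return sum(t.est_gpu_hours for t in self.assigned)

def rental_cost(inst: Instance) -> float:
    # Simplifying assumption: an instance is rented for its whole capacity window.
    return inst.hourly_price * inst.capacity_gpu_hours

def schedule(trials: List[Trial], catalog: List[Instance], budget: float):
    """First-fit-decreasing packing of trials onto rented instances under a budget."""
    rented: List[Instance] = []
    spent = 0.0
    # Pack the largest trials first to keep fragmentation low.
    for trial in sorted(trials, key=lambda t: t.est_gpu_hours, reverse=True):
        # First fit: reuse an already-rented instance that still has headroom.
        target: Optional[Instance] = next(
            (inst for inst in rented
             if inst.load + trial.est_gpu_hours <= inst.capacity_gpu_hours),
            None)
        if target is None:
            # Otherwise rent the cheapest catalog type that fits the trial and the budget.
            for tmpl in sorted(catalog, key=lambda i: i.hourly_price):
                cost = rental_cost(tmpl)
                if tmpl.capacity_gpu_hours >= trial.est_gpu_hours and spent + cost <= budget:
                    target = Instance(tmpl.name, tmpl.capacity_gpu_hours, tmpl.hourly_price)
                    rented.append(target)
                    spent += cost
                    break
        if target is None:
            print(f"trial {trial.trial_id} deferred: budget exhausted")
            continue
        target.assigned.append(trial)
    return rented, spent

# Toy example with made-up demands, capacities, and prices.
trials = [Trial(i, h) for i, h in enumerate([6.0, 3.5, 2.0, 1.5, 1.0, 0.5])]
catalog = [Instance("gpu-large", 8.0, 3.0), Instance("gpu-small", 4.0, 0.6)]
plan, spent = schedule(trials, catalog, budget=10.0)
for inst in plan:
    print(inst.name, [t.trial_id for t in inst.assigned], f"load={inst.load}")
print(f"total rental cost: {spent}")

Sorting trials by decreasing demand is the standard first-fit-decreasing refinement; it tends to reduce the number of instances that must be rented, which directly lowers cost under the budget.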


Acknowledgment

This work was supported in part by the NSF of China under Grants 61771289 and 61832012, the Key Research and Development Program of Shandong Province under Grant 2019JZZY020124, and the Key Program of Science and Technology of Shandong under Grant 2020CXGC010901.

Author information

Corresponding author

Correspondence to Jiguo Yu.

Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Yao, Y., Yu, J., Cao, J., Liu, Z. (2022). Budget-Aware Scheduling for Hyperparameter Optimization Process in Cloud Environment. In: Lai, Y., Wang, T., Jiang, M., Xu, G., Liang, W., Castiglione, A. (eds) Algorithms and Architectures for Parallel Processing. ICA3PP 2021. Lecture Notes in Computer Science, vol. 13157. Springer, Cham. https://doi.org/10.1007/978-3-030-95391-1_18

  • DOI: https://doi.org/10.1007/978-3-030-95391-1_18

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-95390-4

  • Online ISBN: 978-3-030-95391-1

  • eBook Packages: Computer Science, Computer Science (R0)
