
Towards Automatic Bayesian Optimization: A First Step Involving Acquisition Functions

  • Conference paper

Advances in Artificial Intelligence (CAEPIA 2021)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 12882)

Abstract

Bayesian Optimization (BO) is the state-of-the-art technique for the optimization of black boxes, i.e., functions for which we have access to neither an analytical expression nor gradients, that are expensive to evaluate, and whose evaluations are noisy. A typical BO application is the automatic hyperparameter tuning of machine learning algorithms. BO methodologies themselves have hyperparameters that need to be configured, such as the surrogate model or the acquisition function (AF), and poor choices for these hyperparameters lead to poor results. Typically, these hyperparameters are tuned by making assumptions about the objective function that we want to evaluate, but there are scenarios where we do not have any prior information. In this paper, we propose an attempt at automatic BO by exploring several heuristics that automatically tune the BO acquisition function. We illustrate the effectiveness of these heuristics on a set of benchmark problems and a hyperparameter tuning problem.
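The loop the abstract describes (fit a surrogate to the observations, maximise an acquisition function, evaluate the black box at the winner, repeat) can be sketched in a few lines. The following is a minimal, numpy-only illustration, not the authors' implementation: it fixes the surrogate to a Gaussian process with an RBF kernel and the AF to Expected Improvement, whereas the paper's heuristics would tune the AF choice automatically. The toy objective, kernel lengthscale, and candidate grid are all assumptions made for the demo.

```python
import math
import numpy as np

def rbf_kernel(a, b, lengthscale=0.2, variance=1.0):
    """Squared-exponential kernel between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and std at candidates Xs, given observations (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = rbf_kernel(Xs, Xs).diagonal() - np.sum(v ** 2, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """EI acquisition for minimisation: expected gain below the current best."""
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.array([math.erf(zi / math.sqrt(2.0)) for zi in z]))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return np.maximum(sigma * (z * Phi + phi), 0.0)

# Toy black box (hypothetical): treated as unknown by the optimiser.
f = lambda x: (x - 0.3) ** 2

grid = np.linspace(0.0, 1.0, 201)   # discretised candidate set
sampled = [0, 100, 200]             # three initial design points
X, y = grid[sampled], f(grid[sampled])

for _ in range(12):                 # BO loop: fit GP, maximise AF, evaluate
    mu, sigma = gp_posterior(X, y, grid)
    ei = expected_improvement(mu, sigma, y.min())
    ei[sampled] = -1.0              # never re-evaluate a known point
    nxt = int(np.argmax(ei))
    sampled.append(nxt)
    X = np.append(X, grid[nxt])
    y = np.append(y, f(grid[nxt]))

x_best = X[np.argmin(y)]            # incumbent after the budget is spent
```

With a budget of 15 evaluations, the incumbent lands near the true minimiser x = 0.3; in a real deployment the objective would be expensive (e.g., a cross-validation score), which is what justifies the surrogate-plus-AF machinery.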



Acknowledgements

The authors gratefully acknowledge the use of the facilities of the Centro de Computación Científica (CCC) at Universidad Autónoma de Madrid. The authors also acknowledge financial support from the Spanish Plan Nacional I+D+i, grants TIN2016-76406-P and PID2019-106827GB-I00/AEI/10.13039/501100011033.

Author information


Corresponding author

Correspondence to Eduardo C. Garrido Merchán.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Jariego Pérez, L.C., Garrido Merchán, E.C. (2021). Towards Automatic Bayesian Optimization: A First Step Involving Acquisition Functions. In: Alba, E., et al. (eds.) Advances in Artificial Intelligence. CAEPIA 2021. Lecture Notes in Computer Science, vol. 12882. Springer, Cham. https://doi.org/10.1007/978-3-030-85713-4_16


  • DOI: https://doi.org/10.1007/978-3-030-85713-4_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85712-7

  • Online ISBN: 978-3-030-85713-4

  • eBook Packages: Computer Science, Computer Science (R0)
