DOI: 10.1145/3377929.3389952

Poster

Hybrid Bayesian evolutionary optimization for hyperparameter tuning

Published: 08 July 2020

Abstract

In this paper, we present HBEtune, a hybrid Bayesian-evolutionary tuning algorithm for tuning the hyperparameters of machine learning algorithms or evolutionary algorithms, and analyze its performance. HBEtune combines meta-EA and Bayesian optimization techniques.
Hyperparameter tuning is a noisy, black-box optimization problem with an expensive target function, so a practical tuner must minimize the number of samples it draws. Our method guides the EA's recombination operator towards more promising samples by employing the expected improvement acquisition criterion commonly used in Bayesian optimization. The expected improvement is evaluated on a surrogate model fitted by Gaussian process regression.
HBEtune shows generally competitive performance when compared with the state-of-the-art irace tuner. We analyze performance across a suite of synthetic and real-world benchmark problems.
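The expected improvement criterion mentioned in the abstract can be sketched as follows. This is an illustrative minimization-mode EI computed from a surrogate's posterior mean and standard deviation, not the authors' implementation; the function names (`expected_improvement`, `pick_offspring`) and the toy numbers are our own, and the surrogate posterior is assumed to come from a Gaussian process fitted elsewhere.

```python
import math
import numpy as np

def expected_improvement(mu, sigma, f_best):
    """EI for minimization, given surrogate posterior mean `mu` and
    standard deviation `sigma` at each candidate point."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    z = np.where(sigma > 0, (f_best - mu) / np.maximum(sigma, 1e-12), 0.0)
    cdf = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))  # standard normal CDF
    pdf = np.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)           # standard normal PDF
    ei = (f_best - mu) * cdf + sigma * pdf
    return np.where(sigma > 0, ei, 0.0)  # no expected gain where the surrogate is certain

def pick_offspring(candidates, surrogate_mu, surrogate_sigma, f_best):
    """Guided recombination: of several candidate offspring, keep the one
    the surrogate considers most promising instead of sampling blindly."""
    ei = expected_improvement(surrogate_mu, surrogate_sigma, f_best)
    return candidates[int(np.argmax(ei))]

# Toy example: three candidate hyperparameter vectors with surrogate predictions.
candidates = np.array([[0.1, 0.9], [0.5, 0.5], [0.9, 0.1]])
mu = np.array([0.40, 0.25, 0.55])     # posterior means of the loss
sigma = np.array([0.05, 0.20, 0.01])  # posterior standard deviations
best = pick_offspring(candidates, mu, sigma, f_best=0.30)
```

EI trades off exploitation (low predicted mean) against exploration (high predictive uncertainty), which is why the second candidate, with both a low mean and high uncertainty, wins here.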

References

[1] K. S. Anderson and YuHong Hsu. 1999. Genetic crossover strategy using an approximation concept. In Proceedings of the 1999 Congress on Evolutionary Computation (CEC '99). IEEE, 527--533.
[2] Thomas Bartz-Beielstein, Christian Lasarczyk, and Mike Preuss. 2005. Sequential parameter optimization. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation (CEC '05). Edinburgh, Scotland, 773--780. http://www.spotseven.de/wp-content/papercite-data/pdf/blp05.pdf
[3] James Bergstra, Daniel Yamins, and David Daniel Cox. 2013. Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures. In Proceedings of the 30th International Conference on Machine Learning (ICML '13). JMLR, 115--123.
[4] Eduardo C. Garrido-Merchán and Daniel Hernández-Lobato. 2020. Dealing with categorical and integer-valued variables in Bayesian optimization with Gaussian processes. Neurocomputing 380 (2020), 20--35.
[5] Frank Hutter. 2009. Automated Configuration of Algorithms for Solving Hard Computational Problems. Ph.D. dissertation. University of British Columbia.
[6] Yaochu Jin. 2005. A comprehensive survey of fitness approximation in evolutionary computation. Soft Computing 9, 1 (2005), 3--12.
[7] Donald R. Jones, Matthias Schonlau, and William J. Welch. 1998. Efficient global optimization of expensive black-box functions. Journal of Global Optimization 13, 4 (1998), 455--492.
[8] J. Knowles. 2006. ParEGO: A hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. IEEE Transactions on Evolutionary Computation 10, 1 (2006), 50--66.
[9] Manuel López-Ibáñez, Jérémie Dubois-Lacoste, Leslie Pérez Cáceres, Mauro Birattari, and Thomas Stützle. 2016. The irace package: Iterated racing for automatic algorithm configuration. Operations Research Perspectives 3 (2016), 43--58. https://CRAN.R-project.org/package=irace
[10] Khaled Rasheed and Haym Hirsh. 2000. Informed operators: Speeding up genetic-algorithm-based design optimization using reduced models. In Proceedings of the 2nd Annual Conference on Genetic and Evolutionary Computation. 628--635.
[11] Carl Edward Rasmussen and Christopher K. I. Williams. 2006. Gaussian Processes for Machine Learning. MIT Press, Cambridge, MA, USA. http://www.gaussianprocess.org/gpml/
[12] Ibai Roman, Josu Ceberio, Alexander Mendiburu, and Jose A. Lozano. 2016. Bayesian optimization for parameter tuning in evolutionary algorithms. In Proceedings of the 2016 IEEE Congress on Evolutionary Computation (CEC '16). IEEE, 4839--4845.
[13] Bobak Shahriari, Kevin Swersky, Ziyu Wang, Ryan P. Adams, and Nando de Freitas. 2015. Taking the human out of the loop: A review of Bayesian optimization. Proc. IEEE 104, 1 (2015), 148--175.
[14] Jasper Snoek, Hugo Larochelle, and Ryan P. Adams. 2012. Practical Bayesian optimization of machine learning algorithms. In Advances in Neural Information Processing Systems. 2951--2959.
[15] Thomas Stützle. 2002. ACOTSP v1.03: A software package implementing various Ant Colony Optimization algorithms applied to the symmetric Traveling Salesman Problem. http://www.aco-metaheuristic.org/aco-code/
[16] Vanessa Volz, Günter Rudolph, and Boris Naujoks. 2017. Surrogate-assisted partial order-based evolutionary optimisation. In Evolutionary Multi-Criterion Optimization, Heike Trautmann, Günter Rudolph, Kathrin Klamroth, Oliver Schütze, Margaret Wiecek, Yaochu Jin, and Christian Grimme (Eds.). Springer, Cham, 639--653.

Published In

GECCO '20: Proceedings of the 2020 Genetic and Evolutionary Computation Conference Companion
July 2020
1982 pages
ISBN:9781450371278
DOI:10.1145/3377929
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  • algorithm configuration
  • Bayesian optimization
  • evolutionary algorithms
  • Gaussian process regression
  • parameter tuning

Funding Sources

  • German Federal Ministry for Economic Affairs and Energy

Conference

GECCO '20

Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions (38%)
