DOI: 10.1145/3583131.3590364

Hybridizing TPOT with Bayesian Optimization

Published: 12 July 2023

ABSTRACT

The Tree-based Pipeline Optimization Tool (TPOT) automatically constructs and optimizes machine learning pipelines for classification or regression tasks. The pipelines are represented as trees comprising multiple data transformation and machine learning operators, each using a discrete hyper-parameter space, and are optimized with genetic programming. During the evolutionary process, TPOT evaluates numerous pipelines, which can be challenging when the computational budget is limited. In this study, we integrate TPOT with Bayesian optimization (BO) to extend its ability to search across continuous hyper-parameter spaces, and attempt to improve its performance under a limited computational budget. Multiple hybrid variants are proposed and systematically evaluated, including (a) sequential/periodic use of BO and (b) use of discrete/continuous search spaces for BO. The performance of these variants is assessed using six data sets with up to 20 features and 20,000 samples. Furthermore, an adaptive variant is designed in which the choice of whether to apply TPOT or BO is made automatically in each generation. While the variants did not produce results significantly better than "standard" TPOT, the study uncovered important insights into the behavior and limitations of TPOT itself, which are valuable for designing improved variants.
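
The sequential hybrid described in the abstract can be pictured as a two-phase budget split: part of the evaluation budget goes to TPOT's genetic programming search over discrete pipeline structures, and the remainder to BO over continuous hyper-parameter ranges. The following is a minimal sketch of that idea, not the authors' implementation: it uses Optuna's TPE sampler as the BO component and, for simplicity, refines a fixed random forest as a hypothetical stand-in for whichever pipeline TPOT actually returns; the dataset, search ranges, and budgets are all illustrative.

```python
import optuna
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from tpot import TPOTClassifier

# Load a small benchmark task (illustrative; not one of the paper's six data sets).
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Phase 1: spend part of the evaluation budget on TPOT's genetic programming
# search over discrete pipeline structures and hyper-parameter grids.
tpot = TPOTClassifier(generations=3, population_size=20,
                      random_state=42, verbosity=0)
tpot.fit(X_train, y_train)
print("TPOT test accuracy:", tpot.score(X_test, y_test))

# Phase 2: spend the remaining budget on BO (Optuna's TPE sampler) over a
# continuous hyper-parameter space, which TPOT's discrete grids cannot cover.
# The random forest here is a simplified stand-in for the final estimator of
# the pipeline found in phase 1.
def objective(trial):
    clf = RandomForestClassifier(
        n_estimators=trial.suggest_int("n_estimators", 50, 500),
        max_features=trial.suggest_float("max_features", 0.05, 1.0),  # continuous
        min_samples_leaf=trial.suggest_int("min_samples_leaf", 1, 20),
        random_state=42,
    )
    return cross_val_score(clf, X_train, y_train, cv=5).mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print("BO best CV accuracy:", study.best_value)
print("BO best params:", study.best_params)
```

Splitting the budget this way reflects the motivation stated in the abstract: TPOT itself only searches discrete hyper-parameter grids, so the BO phase is what grants access to continuous ranges such as max_features above. The periodic and adaptive variants interleave the two phases rather than running them back to back.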


Published in

GECCO '23: Proceedings of the Genetic and Evolutionary Computation Conference
July 2023, 1667 pages
ISBN: 9798400701191
DOI: 10.1145/3583131

Copyright © 2023 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Qualifiers

      • research-article

      Acceptance Rates

Overall acceptance rate: 1,669 of 4,410 submissions, 38%

Article Metrics

• Downloads (last 12 months): 86
• Downloads (last 6 weeks): 11
