DOI: 10.1145/3319619.3326856
research-article

Benchmarking GNN-CMA-ES on the BBOB noiseless testbed

Published: 13 July 2019

ABSTRACT

In this paper, we evaluate the GNN-CMA-ES algorithm on the BBOB noiseless testbed. GNN-CMA-ES was recently proposed as a plug-in extension to CMA-ES that makes it possible to train flexible search distributions, in contrast to standard parametric search distributions such as the multivariate Gaussian. By comparing GNN-CMA-ES with CMA-ES, we show the benefits of this extension on some unimodal functions as well as on a variety of multimodal functions. We also identify a family of unimodal functions on which GNN-CMA-ES can degrade the performance of CMA-ES, and discuss possible reasons for this behavior.
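To make the notion of a "search distribution" concrete, the sketch below implements a deliberately simplified (mu/mu, lambda) evolution strategy with an isotropic Gaussian search distribution on the sphere function. This is only a toy stand-in for CMA-ES: it omits covariance-matrix and step-size adaptation (replaced here by a fixed geometric decay of sigma), and all parameter values are illustrative, not taken from the paper.

```python
import random


def sphere(x):
    """Separable sphere function, the simplest unimodal BBOB-style benchmark."""
    return sum(xi * xi for xi in x)


def simple_es(f, x0, sigma=0.3, lam=20, iters=200, seed=0):
    """Toy (mu/mu, lambda) evolution strategy with an isotropic Gaussian
    search distribution N(mean, sigma^2 I).

    A heavily simplified stand-in for CMA-ES: no covariance adaptation,
    and a fixed geometric decay of sigma instead of step-size control.
    """
    rng = random.Random(seed)
    mean = list(x0)
    mu = lam // 4  # number of selected parents
    for _ in range(iters):
        # Sample lambda candidates from the current search distribution.
        pop = [[m + sigma * rng.gauss(0, 1) for m in mean] for _ in range(lam)]
        # Rank candidates by objective value (lower is better).
        pop.sort(key=f)
        # Recombine: the new mean is the average of the mu best candidates.
        mean = [sum(p[i] for p in pop[:mu]) / mu for i in range(len(mean))]
        # Crude substitute for step-size adaptation.
        sigma *= 0.98
    return mean, f(mean)


best, fbest = simple_es(sphere, [3.0, -2.0])
```

In CMA-ES proper, the isotropic Gaussian above is replaced by a full multivariate Gaussian whose covariance and step size are adapted online; GNN-CMA-ES goes further and lets a generative neural network shape the search distribution beyond the Gaussian family.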


Published in

GECCO '19: Proceedings of the Genetic and Evolutionary Computation Conference Companion
July 2019, 2161 pages
ISBN: 9781450367486
DOI: 10.1145/3319619

      Copyright © 2019 ACM


Publisher: Association for Computing Machinery, New York, NY, United States
