The Hessian Estimation Evolution Strategy

  • Conference paper
  • Published in: Parallel Problem Solving from Nature – PPSN XVI (PPSN 2020)

Abstract

We present a novel black-box optimization algorithm called the Hessian Estimation Evolution Strategy. The algorithm updates the covariance matrix of its sampling distribution by directly estimating the curvature of the objective function. This design targets twice continuously differentiable problems. To this end, we extend the cumulative step-size adaptation algorithm of the CMA-ES to mirrored sampling. We demonstrate that our approach to covariance matrix adaptation is efficient by evaluating it on the BBOB/COCO testbed. We also show that the algorithm is surprisingly robust when its core assumption of a twice continuously differentiable objective function is violated. The approach yields a new evolution strategy with competitive performance, and at the same time it offers an interesting alternative to the usual covariance matrix update mechanism.
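The abstract's core idea, estimating curvature from function values along mirrored sample directions, can be sketched with a standard second-difference estimate of the directional curvature d^T H d. This is an illustrative sketch only (the function `curvature_along`, the step size, and the quadratic test function are assumptions for demonstration, not the paper's exact update rule):

```python
import numpy as np

def curvature_along(f, x, d, sigma=1e-3):
    """Second-difference estimate of the curvature d^T H d of f at x
    along a unit direction d, using a mirrored pair of evaluations
    f(x + sigma*d) and f(x - sigma*d)."""
    fp = f(x + sigma * d)
    fm = f(x - sigma * d)
    return (fp - 2.0 * f(x) + fm) / sigma**2

# Illustrative check on a convex quadratic f(x) = 0.5 x^T Q x,
# whose true curvature along d is d^T Q d.
rng = np.random.default_rng(0)
Q = np.diag([1.0, 100.0])            # ill-conditioned Hessian
f = lambda x: 0.5 * x @ Q @ x
x = rng.standard_normal(2)
d = rng.standard_normal(2)
d /= np.linalg.norm(d)               # unit direction
est = curvature_along(f, x, d)
assert abs(est - d @ Q @ d) < 1e-3 * max(1.0, d @ Q @ d)
```

On a quadratic this estimate is exact up to rounding, which is why mirrored sampling pairs are a natural fit for curvature estimation: each mirrored pair plus the center point yields one directional curvature value for free.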


Notes

  1. The decomposition is never computed explicitly in the algorithm; instead, the algorithm updates the factor A directly.

  2. https://www.ini.rub.de/the_institute/people/tobias-glasmachers/#software.
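Note 1's point, that the covariance C = A Aᵀ is reshaped by updating the factor A directly, without ever forming or decomposing C, can be illustrated with a multiplicative rescaling along one direction. The direction `u`, the factor `s`, and the update form below are illustrative assumptions, not the paper's actual update rule:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))      # current factor, C = A A^T
u = rng.standard_normal(n)
u /= np.linalg.norm(u)               # unit direction in sample space
s = 0.5                              # illustrative rescaling factor

# Multiplicative update of the factor: scale samples x = m + A z
# by s along u, without ever forming or decomposing C itself.
A_new = A @ (np.eye(n) + (s - 1.0) * np.outer(u, u))

# Implied covariance change: C_new = A (I + (s^2 - 1) u u^T) A^T,
# since (I + (s-1) u u^T)^2 = I + (s^2 - 1) u u^T for unit u.
C_new = A_new @ A_new.T
C_expected = A @ (np.eye(n) + (s**2 - 1.0) * np.outer(u, u)) @ A.T
assert np.allclose(C_new, C_expected)
```

Keeping the factor A as the state variable means sampling stays cheap (a matrix-vector product) and the covariance is reshaped implicitly, one rank-one rescaling at a time.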


Author information


Corresponding author

Correspondence to Tobias Glasmachers.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Glasmachers, T., Krause, O. (2020). The Hessian Estimation Evolution Strategy. In: Bäck, T., et al. Parallel Problem Solving from Nature – PPSN XVI. PPSN 2020. Lecture Notes in Computer Science(), vol 12269. Springer, Cham. https://doi.org/10.1007/978-3-030-58112-1_41

  • DOI: https://doi.org/10.1007/978-3-030-58112-1_41

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-58111-4

  • Online ISBN: 978-3-030-58112-1

  • eBook Packages: Computer Science, Computer Science (R0)
