Abstract
We present a novel black-box optimization algorithm, the Hessian Estimation Evolution Strategy. The algorithm updates the covariance matrix of its sampling distribution by directly estimating the curvature of the objective function; this design is targeted at twice continuously differentiable problems. Because the curvature estimates are obtained from mirrored sample pairs, we extend the cumulative step-size adaptation algorithm of the CMA-ES to mirrored sampling. We demonstrate that our approach to covariance matrix adaptation is efficient by evaluating it on the BBOB/COCO testbed, and we show that the algorithm is surprisingly robust when its core assumption of a twice continuously differentiable objective function is violated. The approach yields a new evolution strategy with competitive performance, and at the same time it offers an interesting alternative to the usual covariance matrix update mechanism.
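To make the estimation principle concrete, here is a minimal sketch (our own illustration, not the exact procedure from the paper; the function estimate_curvature, the parameter num_pairs, and the choice of random unit directions are assumptions) of how a mirrored pair m ± σ·A·d, evaluated together with the mean m, yields a second-order finite-difference estimate of the curvature along the direction A·d:

```python
import numpy as np

def estimate_curvature(f, m, A, sigma, num_pairs=5, rng=None):
    """Illustrative only: estimate directional curvature of f from
    mirrored sample pairs m +/- sigma * A @ d.

    For a twice continuously differentiable f with Hessian H, the
    second-order finite difference
        h = (f(m + s) - 2 f(m) + f(m - s)) / sigma**2,  s = sigma * A @ d,
    approximates d^T A^T H A d (exact for quadratic f).
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(m)
    f_m = f(m)
    estimates = []
    for _ in range(num_pairs):
        d = rng.standard_normal(n)
        d /= np.linalg.norm(d)        # random unit direction
        s = sigma * (A @ d)           # offset shared by the mirrored pair
        h = (f(m + s) - 2.0 * f_m + f(m - s)) / sigma**2
        estimates.append((d, h))
    return estimates

# Sanity check on the sphere f(x) = x^T x (Hessian 2I): with A = I,
# every estimate equals 2 regardless of the sampled direction.
if __name__ == "__main__":
    f = lambda x: float(x @ x)
    for d, h in estimate_curvature(f, m=np.ones(4), A=np.eye(4), sigma=1e-3):
        print(f"{h:.6f}")  # ~2.000000
```

On a convex quadratic the second difference is exact, which is why the sphere check prints 2 for every direction; on general twice continuously differentiable functions the error vanishes as σ → 0 (at rate O(σ²) for sufficiently smooth f).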
Notes
- 1.
The decomposition C = A Aᵀ is never computed explicitly in the algorithm; instead, the algorithm directly updates the factor A (see the sketch after these notes).
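The note above can be illustrated with a hedged sketch (our own construction; update_factor, h_ref, and eta are hypothetical placeholders, and the paper's actual normalization of the scaling factors differs): multiplying A by a symmetric rank-one matrix reshapes the implicit covariance C = A Aᵀ along a sampled direction without ever forming C.

```python
import numpy as np

def update_factor(A, d, h, h_ref=1.0, eta=0.5):
    """Illustrative only: rescale the factor A along the unit direction d.

    The covariance C = A @ A.T is never formed. Multiplying A by
    G = I + (c - 1) d d^T scales the axis A @ d by c, i.e. the implicit
    covariance by c**2, shrinking it where the curvature estimate h is
    large relative to a reference h_ref (eta damps the update).
    """
    c = (h_ref / h) ** (eta / 2.0)                    # per-direction scale
    G = np.eye(len(d)) + (c - 1.0) * np.outer(d, d)   # symmetric rank-one
    return A @ G
```

Directions orthogonal to d are left unchanged by G, so each update only reshapes the distribution along the sampled axis; the global scale of the distribution is then left to step-size adaptation.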
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Glasmachers, T., Krause, O. (2020). The Hessian Estimation Evolution Strategy. In: Bäck, T., et al. (eds.) Parallel Problem Solving from Nature – PPSN XVI. PPSN 2020. Lecture Notes in Computer Science, vol. 12269. Springer, Cham. https://doi.org/10.1007/978-3-030-58112-1_41
DOI: https://doi.org/10.1007/978-3-030-58112-1_41
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-58111-4
Online ISBN: 978-3-030-58112-1
eBook Packages: Computer Science, Computer Science (R0)