
Part of the book series: Springer Handbooks (SHB)

Abstract

Evolution strategies (ES) are evolutionary algorithms that date back to the 1960s and that are most commonly applied to black-box optimization problems in continuous search spaces. Inspired by biological evolution, their original formulation is based on the application of mutation, recombination, and selection in populations of candidate solutions. From the algorithmic viewpoint, ES are optimization methods that sample new candidate solutions stochastically, most commonly from a multivariate normal probability distribution. Their two most prominent design principles are unbiasedness and adaptive control of parameters of the sample distribution. In this overview, the important concepts of success-based step-size control, self-adaptation, and derandomization are covered, as well as more recent developments such as covariance matrix adaptation and natural ES. The latter give new insights into the fundamental mathematical rationale behind ES. A broad discussion of theoretical results includes progress rate results on various function classes and convergence proofs for evolution strategies.
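The success-based step-size control mentioned in the abstract can be illustrated with a minimal (1+1)-ES using the classical one-fifth success rule. This is an illustrative sketch, not code from the chapter; the function name and the adaptation constants (1.5 and its compensating power) are choices made here for the example.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, iterations=4000, seed=1):
    """(1+1)-ES with one-fifth-success-rule step-size control (sketch)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx, sigma = f(x), sigma0
    for _ in range(iterations):
        # Mutation: sample one offspring from an isotropic normal distribution.
        y = x + sigma * rng.standard_normal(x.size)
        fy = f(y)
        if fy <= fx:
            # Success: accept the offspring and enlarge the step size.
            x, fx = y, fy
            sigma *= 1.5
        else:
            # Failure: shrink the step size. The factors balance exactly at a
            # success rate of 1/5, since 1.5 * (1.5 ** -0.25) ** 4 == 1.
            sigma *= 1.5 ** -0.25
    return x, fx
```

On a smooth unimodal function such as the sphere, this scheme keeps the success probability near 1/5 and yields the log-linear convergence discussed in the theory sections of the chapter.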


Abbreviations

CMA: covariance matrix adaptation
CSA: cumulative step-size adaptation
ES: evolution strategy
i.i.d.: independent, identically distributed
IPOP: increasing population size
NES: natural evolution strategy
xNES: exponential natural evolution strategy


Author information


Correspondence to Nikolaus Hansen.



Copyright information

© 2015 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Hansen, N., Arnold, D.V., Auger, A. (2015). Evolution Strategies. In: Kacprzyk, J., Pedrycz, W. (eds) Springer Handbook of Computational Intelligence. Springer Handbooks. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-43505-2_44


  • DOI: https://doi.org/10.1007/978-3-662-43505-2_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-43504-5

  • Online ISBN: 978-3-662-43505-2

  • eBook Packages: Engineering, Engineering (R0)
