DOI: 10.1145/3449639.3459367
GECCO Conference Proceedings · research article

Generalized jump functions

Published: 26 June 2021

ABSTRACT

Jump functions are the most studied non-unimodal benchmark in the theory of evolutionary algorithms (EAs). They have significantly improved our understanding of how EAs escape from local optima. However, their particular structure (to leave the local optimum, the EA can only jump directly to the global optimum) raises the question of how representative the recent findings are.

For this reason, we propose an extended class Jump_{k,δ} of jump functions that incorporate a valley of low fitness of width δ starting at distance k from the global optimum. We prove that several previous results extend to this more general class: for all k = o(n^{1/3}) and δ < k, the optimal mutation rate for the (1 + 1) EA is δ/n, and the fast (1 + 1) EA runs faster than the classical (1 + 1) EA by a factor super-exponential in δ. However, we also observe that some known results do not generalize: the randomized local search algorithm with stagnation detection, which is faster than the fast (1 + 1) EA by a factor polynomial in k on Jump_k, is slower by a factor polynomial in n on some Jump_{k,δ} instances.
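To make the class concrete, the following is a minimal sketch of a Jump_{k,δ} fitness function on bit strings. The exact boundary conventions are an assumption here (one plausible reading of the abstract's description, not the paper's formal definition): the valley covers onemax values strictly between n − k and n − k + δ, with δ = k recovering the classical Jump_k shape.

```python
def generalized_jump(x, k, delta):
    """Sketch of a Jump_{k,delta} fitness function on a 0/1 list x.

    Assumed shape (not the paper's formal definition): the valley of
    low fitness covers onemax values strictly between n - k and
    n - k + delta; outside the valley the fitness is onemax + delta,
    inside it is n - onemax. With delta = k this recovers the
    classical Jump_k landscape with its valley just below the optimum.
    """
    n = len(x)
    ones = sum(x)
    if ones <= n - k or ones >= n - k + delta:
        return ones + delta  # gradient toward the optimum, above the valley
    return n - ones  # inside the valley: fitness decreases toward the optimum
```

For example, with n = 10, k = 3, and δ = 2, the all-ones string has fitness 12, a string with 7 ones has fitness 9, and a string with 8 ones lies in the valley with fitness 2.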

Computationally, the new class allows experiments with wider fitness valleys, especially when they lie further away from the global optimum.
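The fast (1 + 1) EA mentioned above replaces the fixed mutation rate with a heavy-tailed one. A minimal sketch of such a heavy-tailed mutation operator, assuming the usual power-law setup (strength α sampled from {1, …, n/2} with Pr[α = a] ∝ a^(−β), then each bit flipped independently with probability α/n; β = 1.5 is a commonly suggested value):

```python
import random

def sample_mutation_strength(n, beta=1.5):
    """Sample alpha from a power-law distribution with exponent beta
    on {1, ..., n // 2}: Pr[alpha = a] is proportional to a**(-beta)."""
    support = list(range(1, n // 2 + 1))
    weights = [a ** -beta for a in support]
    return random.choices(support, weights=weights)[0]

def heavy_tailed_mutation(x, beta=1.5):
    """One heavy-tailed mutation step on a 0/1 list x: draw a strength
    alpha, then flip each bit independently with probability alpha / n."""
    n = len(x)
    alpha = sample_mutation_strength(n, beta)
    return [b ^ int(random.random() < alpha / n) for b in x]
```

The heavy tail means large strengths α (and hence jumps across wide valleys) occur with probability only polynomially small in α, rather than the super-exponentially small probability of a fixed-rate mutation, which is the intuition behind the super-exponential speedup in δ.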


Published in

GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference
June 2021, 1219 pages
ISBN: 978-1-4503-8350-9
DOI: 10.1145/3449639
Copyright © 2021 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 1,669 of 4,410 submissions (38%)
