ABSTRACT
Jump functions are the most studied non-unimodal benchmark in the theory of evolutionary algorithms (EAs). They have significantly improved our understanding of how EAs escape from local optima. However, their particular structure, in which the EA can leave the local optimum only by jumping directly to the global optimum, raises the question of how representative the recent findings are.
For this reason, we propose an extended class Jump_{k,δ} of jump functions that incorporate a valley of low fitness of width δ starting at distance k from the global optimum. We prove that several previous results extend to this more general class: for all k = o(n^{1/3}) and δ ≤ k, the optimal mutation rate for the (1 + 1) EA is δ/n, and the fast (1 + 1) EA runs faster than the classical (1 + 1) EA by a factor super-exponential in δ. However, we also observe that some known results do not generalize: the randomized local search algorithm with stagnation detection, which is faster than the fast (1 + 1) EA by a factor polynomial in k on Jump_k, is slower by a factor polynomial in n on some Jump_{k,δ} instances.
Computationally, the new class allows experiments with wider fitness valleys, especially when they lie further away from the global optimum.
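To make the benchmark concrete, here is a minimal sketch of one natural formalization of Jump_{k,δ} on bit strings, consistent with the description above: a valley of low fitness of width δ ends at Hamming distance k - δ from the all-ones global optimum, and setting δ = k recovers the classical Jump_k function. The function name and exact fitness values are illustrative assumptions, not taken from the paper.

```python
def jump_k_delta(x, k, delta):
    """Sketch of a generalized jump function Jump_{k,delta}.

    x is a list of 0/1 bits; the global optimum is the all-ones
    string. A fitness valley of width delta starts at Hamming
    distance k from the optimum; delta = k gives classical Jump_k.
    """
    n = len(x)
    ones = sum(x)
    assert 1 <= delta <= k < n
    if ones <= n - k or ones >= n - k + delta:
        return ones + delta  # outside the valley: slope toward the optimum
    return n - ones          # inside the valley: low, deceptive fitness
```

With n = 10, k = 3, δ = 2, the valley consists of the strings with exactly 8 one-bits, so an elitist EA sitting at 7 one-bits must flip δ = 2 zero-bits in one mutation to reach fitness 11 at 9 one-bits, which is the structural reason the optimal mutation rate scales with δ rather than with k.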
Index Terms
- Generalized jump functions