Do additional target points speed up evolutionary algorithms?☆
Section snippets
Motivation
Runtime analysis has emerged as a fruitful research area that is helping to develop and consolidate our understanding of the performance of evolutionary algorithms and many other randomised search heuristics [1], [2], [3], [4].
Many results have been obtained for problems from combinatorial optimisation [1] and for pseudo-Boolean functions. The latter include frequently used benchmark functions like OneMax, linear functions, LeadingOnes, Ridge, Jump, Cliff, Plateau (see Section 2 for
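For readers less familiar with these benchmarks, here are minimal sketches of two of them (standard definitions from the runtime-analysis literature; the Python function names are our own):

```python
# Two standard pseudo-Boolean benchmark functions on bit strings from {0,1}^n,
# represented here as lists of 0/1 integers.

def one_max(x):
    """OneMax: the number of one-bits; unique optimum at the all-ones string."""
    return sum(x)

def leading_ones(x):
    """LeadingOnes: the length of the longest all-ones prefix of x."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count
```

Roughly speaking, Jump and Cliff modify OneMax by inserting a region that must be jumped over, while Ridge and Plateau introduce regions of neutral or path-like fitness.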
Preliminaries
We consider the very general class of all black-box algorithms shown in Algorithm 1, which comprises meta-heuristics such as evolutionary algorithms, ant colony optimisation, particle swarm optimisation, estimation-of-distribution algorithms, artificial immune systems, tabu search, simulated annealing, and many more. Here OPT_f denotes the set of optimal solutions for the function f and S is a set of additional target points. Note that in the classical setting S = ∅.
We also show more specific
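The black-box setting can be sketched as follows (our own illustration, not Algorithm 1 from the paper; `sample_next` stands for an arbitrary, possibly history-dependent sampling strategy, and `is_target` tests membership in the union of the optima of f and the additional target set S):

```python
import random

def black_box_search(f, n, is_target, sample_next, max_evals=100_000):
    """Generic black-box search on {0,1}^n: repeatedly sample a point and
    evaluate f; stop as soon as a sampled point is a target (an optimum of f
    or a member of S). Returns the number of f-evaluations used, or None if
    the budget is exhausted."""
    history = []  # all (x, f(x)) pairs seen so far; may guide sample_next
    for t in range(1, max_evals + 1):
        x = sample_next(history, n)
        history.append((x, f(x)))
        if is_target(x):
            return t
    return None

def uniform_sampler(history, n):
    """Pure random search: the simplest instantiation of sample_next."""
    return tuple(random.randint(0, 1) for _ in range(n))
```

Runtime analysis studies the expected number of f-evaluations until a target is hit; enlarging S can only enlarge the set accepted by `is_target`, so it can never increase this hitting time.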
Random and worst-case target points
We first consider the simple scenario where additional target points are added uniformly at random as an average case scenario for the placement of additional optima. It is not surprising that these additional targets do not help much if they only make up a tiny fraction of the whole search space.
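A back-of-the-envelope calculation (ours, for pure random search rather than the algorithms analysed in the paper) illustrates the order of magnitude: if k target points are placed in {0,1}^n, a uniform sample hits one with probability k/2^n, so pure random search needs 2^n/k evaluations in expectation.

```python
def expected_evals_random_search(n, k):
    """Expected f-evaluations for pure random search (sampling uniformly with
    replacement from {0,1}^n) to hit one of k distinct target points: the
    hitting time is geometric with success probability k / 2**n."""
    return 2 ** n / k
```

Even k = n^c targets only shave a polynomial factor off the 2^n baseline, matching the intuition that targets covering a tiny fraction of the search space cannot help much.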
Adding targets around optima
We now consider scenarios where additional optima or target points are placed “close” to global optima. There are several possible notions of “close”. We could aim to reach a solution of a specified minimum fitness. This scenario is highly relevant for practice and has been investigated implicitly in several works (e.g. [43], [44]). Recently, it was studied explicitly under the term fixed target runtime analysis [18].
We cite a fixed-target result for LeadingOnes for illustration. It is notable
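The fixed-target perspective can be made concrete with a small experiment (our own illustration, not the cited theorem): run the (1+1) EA with standard bit mutation on LeadingOnes and record, for every fitness value k, the first iteration at which k is reached.

```python
import random

def fixed_target_times(n, seed=None):
    """(1+1) EA with standard bit mutation (rate 1/n) on LeadingOnes.
    Returns a list T of length n+1 where T[k] is the first iteration at which
    fitness at least k is reached (0 for values reached at initialisation)."""
    rng = random.Random(seed)

    def leading_ones(y):
        c = 0
        for bit in y:
            if bit != 1:
                break
            c += 1
        return c

    x = [rng.randint(0, 1) for _ in range(n)]
    fx = leading_ones(x)
    first_hit = [0] * (n + 1)
    t = 0
    while fx < n:
        t += 1
        # standard bit mutation: flip each bit independently with prob. 1/n
        y = [1 - b if rng.random() < 1.0 / n else b for b in x]
        fy = leading_ones(y)
        if fy >= fx:  # elitist acceptance
            for k in range(fx + 1, fy + 1):
                first_hit[k] = t
            x, fx = y, fy
    return first_hit
```

Fixed-target runtime analysis then studies the expected value of T[k] as a function of the target k, rather than only the optimisation time T[n].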
Best-case placement of additional optima
Finally, we consider a best possible placement of additional target points, and how much the expected running time can be decreased by carefully choosing additional targets.
A rather obvious example of a huge benefit through added optima is the function Trap, a deceptive function on which the (1+1) EA requires time n^Θ(n) [11]. Turning the trap into a target point turns the function into OneMax with an additional optimum at 0^n.
Theorem 5.1 The expected time for the (1+1) EA operating on Trap to find
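The contrast can be reproduced in a few lines (our own sketch; Trap here follows a common definition, Trap(x) = 2n if x is the all-zeros string and OneMax(x) otherwise, so the all-ones string is the deceptive attractor):

```python
import random

def trap(x):
    n = len(x)
    return 2 * n if sum(x) == 0 else sum(x)

def one_plus_one_ea(f, n, targets, seed=None, max_evals=1_000_000):
    """(1+1) EA with standard bit mutation; stops once the current search
    point lies in the target set. Returns the number of f-evaluations,
    or None if the budget runs out first."""
    rng = random.Random(seed)
    x = tuple(rng.randint(0, 1) for _ in range(n))
    evals = 1
    while x not in targets:
        if evals >= max_evals:
            return None
        y = tuple(1 - b if rng.random() < 1.0 / n else b for b in x)
        evals += 1
        if f(y) >= f(x):  # elitist acceptance
            x = y
    return evals

n = 20
all_ones, all_zeros = tuple([1] * n), tuple([0] * n)
# With the deceptive attractor 1^n added as a target, the run behaves like
# the (1+1) EA on OneMax and stops quickly; with {0^n} alone it would need
# time n^Θ(n) in expectation.
fast = one_plus_one_ea(trap, n, {all_ones, all_zeros}, seed=0)
```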
Conclusions
Runtime analysis of randomised search heuristics concentrates on the expected number of fitness function evaluations until a certain target point (often a unique global optimum) is hit for the first time. We studied how the expected optimisation time changes if, in addition to global optima, additional target points are considered.
Our results point out that the benefit of additional targets depends on their number and placement as well as on characteristics of the fitness function. We considered a worst-case
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References (53)
- Real royal road functions—where crossover provably is essential, Discrete Appl. Math. (2005)
- On the analysis of the (1+1) evolutionary algorithm, Theor. Comput. Sci. (2002)
- Analysis of the (1+1) EA on subclasses of linear functions under uniform and linear constraints, Theor. Comput. Sci. (2020)
- Population size versus runtime of a simple evolutionary algorithm, Theor. Comput. Sci. (2008)
- The impact of parametrization in memetic evolutionary algorithms, Theor. Comput. Sci. (2009)
- The choice of the offspring population size in the (1,λ) evolutionary algorithm, Theor. Comput. Sci. (2014)
- The one-dimensional Ising model: mutation versus recombination, Theor. Comput. Sci. (2005)
- Optimization with randomized search heuristics—the (A)NFL theorem, realistic scenarios, and difficult functions, Theor. Comput. Sci. (2002)
- Runtime analysis of a binary particle swarm optimizer, Theor. Comput. Sci. (2010)
- Bioinspired Computation in Combinatorial Optimization – Algorithms and Their Computational Complexity (2010)
- Analyzing Evolutionary Algorithms – The Computer Science Perspective
- On the choice of the offspring population size in evolutionary algorithms, Evol. Comput.
- Drift analysis and evolutionary algorithms revisited, Comb. Probab. Comput.
- Mutation rate matters even when optimizing monotonic functions, Evol. Comput.
- A general dichotomy of evolutionary algorithms on monotone functions, IEEE Trans. Evol. Comput.
- Exponential slowdown for larger populations: the (µ+1)-EA on monotone functions
- A new method for lower bounds on the running time of evolutionary algorithms, IEEE Trans. Evol. Comput.
- Lower bounds from fitness levels made easy
- Upper and lower bounds for randomized search heuristics in black-box optimization, Theory Comput. Syst.
- Black-box search by unbiased variation, Algorithmica
- Complexity theory for discrete black-box optimization heuristics
- Fixed-target runtime analysis
- Fixed-target runtime analysis, Algorithmica
- Runtime analysis for maximizing population diversity in single-objective optimization
Cited by (1)
- Self-adjusting offspring population sizes outperform fixed parameters on the cliff function, Artificial Intelligence (2024)
☆ This article belongs to Section C: Theory of natural computing, edited by Lila Kari.