Evolution strategies with exclusion-based selection operators and a Fourier series auxiliary function

https://doi.org/10.1016/j.amc.2005.06.003

Abstract

To improve the efficiency of evolutionary algorithms (EAs), we proposed a highly efficient speed-up strategy in our previous research work: the exclusion-based selection operators. These operators can efficiently prevent the individuals of EAs from getting into the attractions of local optima through a search-space shrinking method. However, when the global optimum of a minimization problem is located in a very narrow attraction, the exclusion-based selection operators may fail to find this narrow attraction and mistakenly delete the global optimum, making the algorithm unreliable. In this paper, we propose a new complementary efficient speed-up strategy—the Fourier series auxiliary function. This strategy can guide an algorithm to search for optima with narrow attractions efficiently and effectively, and compensates for the deficiency of the exclusion-based selection operators in the algorithm’s reliability. We combine the two strategies to search for the global optima in parallel: one handles optima in normal attractions, the other optima in very narrow attractions. Incorporating these two strategies into any known evolutionary algorithm leads to an accelerated version of the algorithm. As a case study, the new strategies have been incorporated into evolution strategies (ES), yielding an accelerated exclusion and Fourier series auxiliary function ES: the EFES. The EFES is experimentally tested on a suite of 10 complex multimodal function optimization problems and compared against the standard ES (SES) and the fast ES (FES). The experiments all demonstrate that the EFES consistently and significantly outperforms the other two ES in efficiency and solution quality.

Introduction

Evolutionary algorithms (EAs) are global search procedures based on the evolution of a set of solutions viewed as a population of interacting individuals. These algorithms include genetic algorithms (GAs), genetic programming (GP), evolution strategies (ES) and evolutionary programming (EP) [9], [18], [19], [20]. EAs have broad applications and successes as tools for search and optimization. For solving large-scale and complex optimization problems, however, EAs have not proved very efficient; their efficiency in particular has been criticized [17], [27]. We believe the main factor causing the low efficiency of current EAs is convergence toward undesired attractions. This phenomenon occurs when the objective function has local optima with normal attractions, or when its global optimum is located in a narrow attraction (in the minimization case). The relationship between convergence to a global minimum and the geometry (landscape) of the minimization problem is therefore very important. In the first case, if the population of an EA gets trapped in suboptimal states located in comparatively large attractions, it is difficult for the variation operators to produce an offspring that outperforms its parents. In the second case, if the global optima are located in relatively narrow attractions that the individuals of the EA have not yet found, the probability that the variation operators produce offspring inside these narrow attractions is quite low. In both cases, the stochastic mechanism of EAs also yields unavoidable resampling, which increases the algorithm’s complexity and decreases the search efficiency. Many previous studies have investigated such issues for genetic algorithms [23], [28], their parallel versions [13] and parameter control [1], evolutionary programming [22], evolution strategies [10], and evolutionary hybrid approaches [11], [14], [15].

In our previous research work [26], we proposed highly efficient and practical strategies—the exclusion-based selection operators—to establish accelerated EAs for solving global optimization problems. These operators carry out computationally verifiable tests on the “prospects” (or non-existence) of global optimum solutions in the cells of a partitioned search space. Any less-prospective cells can be excluded, so that the EA concentrates on the highly prospective part of the solution space during the search. The exclusion-based selection operators can effectively prevent the individuals of EAs from resampling and from falling into the attractions of local optima, and can therefore accelerate the convergence speed of EAs. Through the space-shrinking strategy, EAs with the exclusion-based selection operators can significantly enhance the efficiency and the precision of the solutions. However, when the global optimum of a minimization problem is located in an extremely narrow attraction, the exclusion-based selection operators may fail to find this narrow attraction and accidentally delete the global optimum, which makes the algorithm unreliable.
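The paper’s concrete operators (E1)–(E5) are defined in [26]. As a loose illustration of the idea only—not the paper’s exact tests—the sketch below excludes a cell when an interval lower bound on a sample objective f(x) = Σ xᵢ² provably exceeds the best value found so far (all function names here are hypothetical):

```python
# Illustrative sketch of exclusion-based selection (NOT the operators of [26]):
# a cell is provably "less prospective" when a lower bound of f over the cell
# exceeds the best objective value already found, so it can be discarded.

def interval_lower_bound_sphere(cell):
    """Lower bound of f(x) = sum(x_i^2) over a box given as [(lo, hi), ...].

    For x^2 on [lo, hi], the minimum is 0 if the interval contains 0,
    otherwise min(lo^2, hi^2).
    """
    bound = 0.0
    for lo, hi in cell:
        if lo <= 0.0 <= hi:
            continue  # this coordinate can contribute 0
        bound += min(lo * lo, hi * hi)
    return bound

def exclude_cells(cells, best_so_far):
    """Keep only cells that may still contain a point better than best_so_far."""
    return [c for c in cells if interval_lower_bound_sphere(c) <= best_so_far]

cells = [[(-1.0, 1.0), (-1.0, 1.0)],   # contains the global minimum at the origin
         [(2.0, 3.0), (2.0, 3.0)]]     # lower bound 8 > 0.5, provably excluded
surviving = exclude_cells(cells, best_so_far=0.5)  # only the first cell survives
```

This is also where the reliability risk arises: if a narrow attraction lies inside a cell whose bound test is too coarse, the cell containing the global optimum may be discarded.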

In this paper, we propose a new complementary, efficient and practical strategy—a Fourier series auxiliary function—that enlarges the narrow attractions of the optima and flattens the large attractions. This auxiliary function can guide an algorithm to search for optima with narrow attractions more efficiently; such optima are difficult for EAs to find in the original objective function. Furthermore, this strategy runs in parallel with the exclusion-based selection operators and compensates for their deficiency, namely the risk of missing optima with very narrow attractions.
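The paper’s actual construction of the auxiliary function is given in Section 3. As a loose one-dimensional illustration only—not the paper’s g(x)—subtracting a low-order Fourier partial sum of f removes the broad, smooth trend, so large attractions are flattened while a narrow attraction stands out in the residual:

```python
# Illustrative sketch only -- NOT the paper's construction of g(x).
# Subtract an m-term Fourier partial sum (the smooth trend of f) so that a
# narrow, high-frequency attraction survives in the residual g = f - S_m.

import math

def fourier_partial_sum(f, a, b, m, n_quad=512):
    """Return S_m(x), the m-term Fourier approximation of f on [a, b],
    with coefficients estimated by a simple rectangle-rule quadrature."""
    L = b - a
    xs = [a + L * k / n_quad for k in range(n_quad)]
    def coeff(trig, k):
        return (2.0 / n_quad) * sum(f(x) * trig(2 * math.pi * k * (x - a) / L)
                                    for x in xs)
    a0 = (1.0 / n_quad) * sum(f(x) for x in xs)
    ak = [coeff(math.cos, k) for k in range(1, m + 1)]
    bk = [coeff(math.sin, k) for k in range(1, m + 1)]
    def S(x):
        t = 2 * math.pi * (x - a) / L
        return a0 + sum(ak[k - 1] * math.cos(k * t) + bk[k - 1] * math.sin(k * t)
                        for k in range(1, m + 1))
    return S

# f: a broad smooth landscape plus one very narrow dip near x = 2.
f = lambda x: math.sin(x) + (-3.0 if abs(x - 2.0) < 0.01 else 0.0)
S = fourier_partial_sum(f, 0.0, 2 * math.pi, m=3)
g = lambda x: f(x) - S(x)   # broad trend removed; the narrow dip dominates g
```

In this toy residual, g is nearly flat away from the dip but remains deeply negative at it, so a search on g is drawn toward the narrow attraction that is hard to hit in f.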

Integrating these two complementary strategies can be viewed as searching on two scales in parallel: one for optima in normal attractions and the other for optima in very narrow attractions. In the case study, the integrated strategies are incorporated into evolution strategies (ES), yielding a new type of accelerated exclusion and Fourier series auxiliary function ES: the EFES. Simulation examples all demonstrate that the new ES consistently and significantly outperforms the previous ES in efficiency and solution quality, particularly for complex problems with optima in narrow attractions.

The paper is organized as follows: we briefly introduce the set of exclusion-based selection operators in Section 2. The Fourier series auxiliary function is then proposed in Section 3. Incorporating the proposed integrated strategies into any known EA leads to an accelerated version of the algorithm; we present in Section 4, as a case study, the accelerated version of evolution strategies—the evolution strategies with exclusion-based selection operators and a Fourier series auxiliary function (EFES). The EFES is then experimentally examined, analyzed and compared in Section 5 on a suite of typical and difficult multimodal function optimization problems. The paper concludes in Section 6 with some useful remarks on future research related to the present work.

Section snippets

Exclusion-based selection operators

This section introduces a somewhat different selection mechanism—the exclusion-based selection operators [26].

Consider any EA solving an optimization problem, say,

(P) $\min\{f(x) : x \in \Omega\}$,

where $f : \Omega \subset \mathbb{R}^n \to \mathbb{R}$ is a function. For simplicity, consider the problem (P) with the domain Ω specified by $\Omega = [u_1, v_1] \times [u_2, v_2] \times \cdots \times [u_n, v_n]$. The new scheme is based on the cellular partition methodology. Given an integer d, let $h_i = (v_i - u_i)/d$; we define

$\sigma(j_1, j_2, \ldots, j_n) = \{x = (x_1, x_2, \ldots, x_n) \in \Omega : (j_i - 1) h_i \leq x_i - u_i \leq j_i h_i,\ 1 \leq j_i \leq d\}$

and let the
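Under this partition, each cell σ(j₁, …, jₙ) is an axis-aligned box with side length hᵢ in coordinate i. A minimal sketch (with hypothetical helper names) of computing the index of the cell containing a point, and the bounds of a given cell:

```python
# Minimal sketch of the cellular partition: with d subdivisions per coordinate,
# h_i = (v_i - u_i) / d, and the cell containing x has index
# j_i = ceil((x_i - u_i) / h_i), clamped to [1, d].

import math

def cell_index(x, u, v, d):
    """Return (j_1, ..., j_n) of the cell sigma(j_1, ..., j_n) containing x."""
    idx = []
    for xi, ui, vi in zip(x, u, v):
        h = (vi - ui) / d
        j = math.ceil((xi - ui) / h)
        idx.append(min(max(j, 1), d))  # clamp so boundary points get a valid cell
    return tuple(idx)

def cell_bounds(j, u, v, d):
    """Box [u_i + (j_i - 1) h_i, u_i + j_i h_i] in each coordinate of cell sigma(j)."""
    return [(ui + (ji - 1) * (vi - ui) / d, ui + ji * (vi - ui) / d)
            for ji, ui, vi in zip(j, u, v)]

u, v, d = [0.0, 0.0], [1.0, 1.0], 4          # unit square, 4 x 4 cells
j = cell_index([0.3, 0.9], u, v, d)          # -> (2, 4)
box = cell_bounds(j, u, v, d)                # -> [(0.25, 0.5), (0.75, 1.0)]
```

The exclusion tests are then run per cell: a bound computed over `cell_bounds(j, ...)` decides whether σ(j) can still contain a global optimum.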

A Fourier series auxiliary function

The exclusion-based selection operators are efficient accelerating operators based on interval arithmetic and cell mapping methods. However, if the function optimization problem is very complex—say, there are many narrow optimal attractions—the optima may be excluded by mistake by the above operators. In any case, searching for an optimum with a narrow attraction is difficult for evolutionary algorithms. Furthermore, it is the main reason that renders the exclusion-based selection

A case study: EFES

The exclusion-based selection operators can prevent the population of evolutionary algorithms (EAs) from getting into the large attractions of local optima and speed up convergence through the space-shrinking strategy, but the global optimum with a narrow attraction might be deleted by mistake. The Fourier series auxiliary function g(x) guides the population of EAs to find the optima with narrow attractions more efficiently by enlarging narrow attractions and flattening large attractions of f(x

Simulations and comparisons

We experimentally evaluate the performance of the EFES and compare it with the (μ, λ) standard evolution strategies (SES) [9] and the fast evolution strategies (FES) [21]. All experiments are conducted on a Pentium IV 1.4 GHz computer.

The EFES was implemented with N = 1000 and ε1 = ε2 = 10⁻⁸. The maximum number M of ES generations was set uniformly to 500 in each epoch of the EFES. In the experiments, all five exclusion operators (E1)–(E5) were applied and the “Orange-peeling

Conclusion

This paper has suggested, analyzed, and explored a new complementary efficient strategy for evolutionary computation—the Fourier series auxiliary function strategy. Integrating this strategy with the exclusion-based selection operators, which are very fast strategies, forms a unique hybridized optimization approach searching in two scales in parallel for optima in normal and very narrow attractions respectively. Any known evolutionary algorithm incorporated with these integrated strategies may

Acknowledgment

This research was partially supported by RGC Earmarked Grant 4192/03E of Hong Kong SAR and RGC Research Grant Direct Allocation of the Chinese University of Hong Kong.

References (28)

  • H.G. Beyer, Toward a theory of evolution strategies: on the benefit of sex—the (μ/μ, λ)-theory, Evolution. Comput. (1995)
  • H.-P. Schwefel, Evolution and Optimum Seeking (1995)
  • H.-P. Schwefel et al., Contemporary evolution strategies
  • K.S. Leung et al., A new model of simulated evolutionary computation: convergence analysis and specifications, IEEE Trans. Evolution. Comput. (2001)