ABSTRACT
Evolutionary Algorithms (EAs) and other metaheuristics are strongly affected by the choice of their parameters, not only in terms of the precision of the solutions found, but also in terms of repeatability, robustness, speed of convergence, and other properties. These performance criteria often conflict with one another. In this work, we cast the problem of EA parameter selection and tuning as a multi-objective optimization problem in which the criteria to be optimized are precision and speed of convergence. We propose EMOPaT (Evolutionary Multi-Objective Parameter Tuning), a method that uses a well-known multi-objective optimization algorithm (NSGA-II) to find a front of non-dominated parameter sets that produce good results according to these two metrics.
By doing so, we can provide three kinds of results: (i) a method that adapts parameters to a single function, (ii) a comparison between Differential Evolution (DE) and Particle Swarm Optimization (PSO) that takes both precision and speed into consideration, and (iii) an insight into how the parameters of DE and PSO affect their performance on different benchmark functions.
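The idea the abstract describes can be sketched in a few lines: each candidate solution of the outer optimizer is a parameter set for an inner EA (here a toy DE with population size, F, and CR), and it is scored on two objectives to be minimized, final error (precision) and fitness evaluations used (speed of convergence). A full NSGA-II would add non-dominated sorting with crowding distance; only the Pareto-front extraction step is shown. All names, the toy DE, and the test function below are illustrative assumptions, not the authors' implementation.

```python
import random

def sphere(x):
    """Toy benchmark: the sphere function, minimum 0 at the origin."""
    return sum(v * v for v in x)

def run_de(pop_size, F, CR, dim=5, budget=2000, target=1e-6, seed=0):
    """Toy DE/rand/1/bin run; returns the two objectives
    (best_error, evals_used), both to be minimized."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fit = [sphere(ind) for ind in pop]
    evals = pop_size
    while evals < budget and min(fit) > target:
        for i in range(pop_size):
            # Pick three distinct donors different from i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantee one mutated component
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            f_trial = sphere(trial)
            evals += 1
            if f_trial <= fit[i]:       # greedy one-to-one replacement
                pop[i], fit[i] = trial, f_trial
            if evals >= budget:
                break
    return min(fit), evals

def pareto_front(points):
    """Indices of non-dominated points (both objectives minimized)."""
    return [i for i, p in enumerate(points)
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                       for q in points)]

# Evaluate a few hypothetical DE parameter sets and keep the trade-off front.
candidates = [(10, 0.5, 0.9), (30, 0.9, 0.5), (50, 0.3, 0.3)]
scores = [run_de(*params) for params in candidates]
front = [candidates[i] for i in pareto_front(scores)]
```

A parameter set that converges faster but less precisely and one that converges slower but more precisely both survive on the front, which is exactly the trade-off EMOPaT exposes to the user.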