Abstract
Parameter setting for evolutionary algorithms is still an important issue in evolutionary computation. There are two main approaches: parameter tuning and parameter control. In this paper, we introduce self-adaptive parameter control of a genetic algorithm based on Bayesian network learning and simulation. The nodes of the Bayesian network are the genetic algorithm parameters to be controlled, and its structure captures the probabilistic conditional (in)dependence relationships between them. The network is learned from the best individuals, i.e., the best configurations of the genetic algorithm. Each individual is evaluated by running the genetic algorithm with the corresponding parameter configuration. Since these runs are time-consuming, each genetic algorithm uses a small population and is stopped before convergence; in this way, promising individuals should not be lost. Experiments on the search for optimal simultaneous row and column orderings of tables yield the same optima as state-of-the-art methods but with a sharp reduction in computation time. Moreover, our approach can cope with high-dimensional problems that have so far remained unsolved.
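To make the meta-level procedure described above concrete, the following Python sketch reproduces its main loop under stated assumptions: parameter configurations are scored by a cheap, early-stopped inner genetic algorithm, a Bayesian network is learned from the best configurations, and new configurations are simulated (forward-sampled) from that network. The parameter domains, the run_small_ga stub, and the use of the third-party pgmpy library (HillClimbSearch, BicScore, BayesianNetwork, BayesianModelSampling; class names vary slightly across pgmpy versions) are illustrative assumptions, not the authors' implementation.

import random

import pandas as pd
from pgmpy.estimators import BicScore, HillClimbSearch, MaximumLikelihoodEstimator
from pgmpy.models import BayesianNetwork
from pgmpy.sampling import BayesianModelSampling

# Hypothetical discrete domains for the GA parameters to be controlled.
PARAM_DOMAINS = {
    "pop_size": [20, 50, 100],
    "crossover_op": ["OX", "PMX", "CX"],
    "mutation_rate": [0.01, 0.05, 0.10],
    "selection": ["tournament", "roulette"],
}

def run_small_ga(config):
    """Placeholder for the inner GA: run it with a small population for the given
    parameter configuration, stop it before convergence, return the best fitness."""
    return random.random()  # stub; replace with the real (cheap) GA run

def random_configs(n):
    """Initial set of parameter configurations, drawn uniformly at random."""
    return pd.DataFrame(
        [{p: random.choice(v) for p, v in PARAM_DOMAINS.items()} for _ in range(n)]
    )

def control_loop(n_generations=10, n_configs=30, top_fraction=0.3):
    configs = random_configs(n_configs)
    best = configs
    for _ in range(n_generations):
        # 1. Evaluate each configuration by running the early-stopped inner GA.
        fitness = [run_small_ga(row.to_dict()) for _, row in configs.iterrows()]
        # 2. Select the best configurations; they form the data set for BN learning.
        order = sorted(range(len(configs)), key=lambda i: fitness[i], reverse=True)
        best = configs.iloc[order[: max(2, int(top_fraction * n_configs))]]
        # 3. Learn the Bayesian network: structure by hill climbing with the BIC
        #    score, conditional probabilities by maximum likelihood.
        dag = HillClimbSearch(best).estimate(scoring_method=BicScore(best))
        bn = BayesianNetwork(dag.edges())
        bn.add_nodes_from(best.columns)  # keep parameters without arcs as isolated nodes
        bn.fit(best, estimator=MaximumLikelihoodEstimator)
        # 4. Simulate (forward-sample) the next set of configurations from the network.
        configs = BayesianModelSampling(bn).forward_sample(size=n_configs)[list(best.columns)]
    return best

if __name__ == "__main__":
    print(control_loop().head())

Because the network is learned only from the best configurations, any arcs it acquires encode interactions between parameters (e.g., between population size and crossover operator) and bias the next round of simulated configurations accordingly.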
Additional information
This work has been partially supported by the Spanish Ministry of Economy and Competitiveness under Grant No. TIN2010-20900-C04-04 and Cajal Blue Brain.
Cite this article
Bielza, C., Fernández del Pozo, J.A. & Larrañaga, P. Parameter Control of Genetic Algorithms by Learning and Simulation of Bayesian Networks — A Case Study for the Optimal Ordering of Tables. J. Comput. Sci. Technol. 28, 720–731 (2013). https://doi.org/10.1007/s11390-013-1370-0