ABSTRACT
In evolutionary algorithms, efficiency enhancement techniques make it possible to solve difficult, large-scale problems in a scalable manner. This paper rigorously analyzes the Bayesian optimization algorithm (BOA) combined with a novel evaluation relaxation method based on entropy measurement (en-BOA). In particular, the concept of entropy is used both to develop the evaluation relaxation strategy (ERS) and to determine the rate of convergence. The entropy measurement-based ERS decides which candidate solutions should be evaluated with the actual fitness function and which can be estimated by the surrogate model. Experiments show that en-BOA significantly reduces the number of actual fitness evaluations without negatively affecting the scalability of BOA. Moreover, the entropy measurement-based evaluation relaxation technique does not require larger population sizes.
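The routing idea described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: it assumes binary-coded individuals, measures population uncertainty as the mean per-position Shannon entropy, and uses a single (hypothetical) threshold to decide whether a generation is evaluated by the true fitness function or estimated by a surrogate. The names `true_fitness`, `surrogate_fitness`, and `threshold` are assumptions for illustration.

```python
import math

def bitwise_entropy(population):
    """Shannon entropy (in bits) of each gene position across the population."""
    n = len(population)
    entropies = []
    for pos in range(len(population[0])):
        p1 = sum(ind[pos] for ind in population) / n
        h = 0.0
        for p in (p1, 1.0 - p1):
            if p > 0.0:
                h -= p * math.log2(p)
        entropies.append(h)
    return entropies

def mean_entropy(population):
    """Average per-position entropy, a scalar measure of population diversity."""
    e = bitwise_entropy(population)
    return sum(e) / len(e)

def evaluate_with_ers(population, true_fitness, surrogate_fitness, threshold=0.5):
    """Illustrative entropy-based evaluation relaxation (assumed mechanism):
    while population entropy is above `threshold`, the population is still
    diverse and the model uncertain, so use the actual fitness function;
    once entropy drops below the threshold, fall back on the cheaper
    surrogate estimate to save actual evaluations."""
    use_actual = mean_entropy(population) >= threshold
    return [true_fitness(ind) if use_actual else surrogate_fitness(ind)
            for ind in population]
```

In a real EDA such as BOA, the surrogate would typically be derived from the learned probabilistic model itself, and the decision could be made per individual rather than per generation; the per-generation switch above is only the simplest possible form of the idea.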