DOI: 10.1145/2001858.2001901
poster

Efficient EDA for large optimization problems via constraining the search space of models

Published: 12 July 2011

ABSTRACT

Introducing efficient Bayesian learning algorithms into Bayesian-network-based EDAs seems necessary in order to use them on large problems. In this paper we propose an algorithm, called CMSS-BOA, which uses a recently introduced heuristic called max-min parents and children (MMPC) [3] to constrain the search space of models. The algorithm does not impose a fixed, small upper bound on the order of interaction between variables, and it is able to solve problems with a large number of variables efficiently. We compare the efficiency of CMSS-BOA with a standard Bayesian-network-based EDA on several benchmark problems.
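As a rough illustration only (not the authors' implementation, whose details are not given in this abstract), the Python sketch below shows the general idea: an EDA loop whose Bayesian-network model is fitted using only a restricted candidate-parent set per variable, the role MMPC [3] plays in CMSS-BOA. A simple pairwise-correlation filter stands in for MMPC, the OneMax fitness is a placeholder, and all function names and parameters are hypothetical.

```python
# Illustrative sketch of constraining the model search space inside an EDA.
# MMPC is replaced by a correlation-based candidate filter for brevity.
import itertools
import numpy as np

def onemax(x):
    # Placeholder fitness function.
    return x.sum()

def candidate_parents(pop, max_candidates=3):
    """Stand-in for MMPC: keep, for each variable, the few most correlated others."""
    n = pop.shape[1]
    with np.errstate(invalid="ignore"):
        corr = np.nan_to_num(np.abs(np.corrcoef(pop.T)))
    np.fill_diagonal(corr, 0.0)
    return {i: list(np.argsort(corr[i])[::-1][:max_candidates]) for i in range(n)}

def fit_constrained_model(pop, cands, max_parents=2):
    """Fit conditional probability tables; variable i may only use earlier
    variables that also lie in its (MMPC-style) candidate set."""
    n = pop.shape[1]
    parents = {i: [p for p in cands[i] if p < i][:max_parents] for i in range(n)}
    model = {}
    for i in range(n):
        ps, table = parents[i], {}
        for assign in itertools.product([0, 1], repeat=len(ps)):
            mask = np.ones(len(pop), dtype=bool)
            for p, v in zip(ps, assign):
                mask &= pop[:, p] == v
            rows = pop[mask]
            table[assign] = (rows[:, i].sum() + 1) / (len(rows) + 2)  # Laplace smoothing
        model[i] = (ps, table)
    return model

def sample(model, n_vars, n_samples, rng):
    """Ancestral sampling of new candidate solutions from the learned model."""
    out = np.zeros((n_samples, n_vars), dtype=int)
    for s in range(n_samples):
        for i in range(n_vars):
            ps, table = model[i]
            out[s, i] = rng.random() < table[tuple(out[s, ps])]
    return out

rng = np.random.default_rng(0)
n_vars, pop_size = 30, 200
pop = rng.integers(0, 2, size=(pop_size, n_vars))
for gen in range(20):
    fitness = np.array([onemax(x) for x in pop])
    selected = pop[np.argsort(fitness)[::-1][: pop_size // 2]]
    model = fit_constrained_model(selected, candidate_parents(selected))
    pop = sample(model, n_vars, pop_size, rng)
    print(gen, fitness.max())
```

The point of the sketch is only the structural constraint: each variable's parent search is limited to a small candidate set before the model is fitted, rather than searching over all possible parent sets with a fixed global bound on interaction order.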

References

  1. Larrañaga, P. and Lozano, J. A. Estimation of Distribution Algorithms. Kluwer Academic Publishers, 2002.
  2. Pelikan, M. Bayesian optimization algorithm: From single level to hierarchy. Ph.D. thesis, University of Illinois, 2006.
  3. Tsamardinos, I., Brown, L. E., and Aliferis, C. F. The max-min hill-climbing Bayesian network structure learning algorithm. Machine Learning, 65(1):31--78, 2006.
