DOI: 10.1145/1389095.1389289

ASAGA: an adaptive surrogate-assisted genetic algorithm

Published: 12 July 2008

ABSTRACT

Genetic algorithms (GAs) applied to complex optimization domains usually need to perform a large number of fitness function evaluations in order to reach near-optimal solutions. In real-world domains such as engineering design, such evaluations can be extremely expensive computationally. It is therefore common to estimate or approximate the fitness. A popular method is to construct a so-called surrogate or meta-model that approximates the original fitness function: it simulates the behavior of the original function but can be evaluated much faster. It is usually difficult to determine which approximation model to use and how frequently to use it, and the answer also varies from problem to problem. To address this, an adaptive surrogate-assisted GA (ASAGA) is presented. ASAGA adaptively chooses the appropriate model type, and adaptively adjusts the model complexity and the frequency of model usage according to the time spent and the model's accuracy. ASAGA also introduces a stochastic penalty function method to handle constraints. Experiments show that ASAGA outperforms non-adaptive surrogate-assisted GAs with statistical significance.
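The adaptive loop the abstract describes can be sketched as follows. This is a minimal illustration, not ASAGA's actual implementation: the objective (`true_fitness`, a sphere function), the toy nearest-neighbour surrogate, and all thresholds are hypothetical stand-ins for the paper's surrogate models and adaptation rules. The key idea shown is adjusting the fraction of evaluations delegated to the surrogate based on its observed accuracy against occasional exact evaluations.

```python
import random

def true_fitness(x):
    # Stand-in for the expensive objective (sphere function, minimization).
    return sum(xi * xi for xi in x)

def nearest_neighbour_surrogate(archive, x):
    # Cheap approximation: fitness of the closest exactly evaluated point.
    return min(archive, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def asaga_sketch(dim=3, pop=20, gens=30, seed=0):
    rng = random.Random(seed)
    population = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    # Archive of exactly evaluated points used to build the surrogate.
    archive = [(ind[:], true_fitness(ind)) for ind in population]
    surrogate_rate = 0.5   # fraction of evaluations delegated to the surrogate
    exact_evals = 0
    for _ in range(gens):
        scored = []
        for ind in population:
            approx = nearest_neighbour_surrogate(archive, ind)
            if rng.random() < surrogate_rate:
                scored.append((approx, ind))           # cheap surrogate evaluation
            else:
                f = true_fitness(ind)                  # exact (expensive) evaluation
                exact_evals += 1
                archive.append((ind[:], f))
                scored.append((f, ind))
                # Adapt usage frequency: use the surrogate more when it is
                # accurate on this control point, less when it is not.
                rel_err = abs(approx - f) / (abs(f) + 1e-9)
                if rel_err < 0.2:
                    surrogate_rate = min(0.9, surrogate_rate + 0.02)
                else:
                    surrogate_rate = max(0.1, surrogate_rate - 0.05)
        # Truncation selection plus Gaussian mutation.
        scored.sort(key=lambda t: t[0])
        parents = [ind for _, ind in scored[: pop // 2]]
        population = []
        for _ in range(pop):
            p = rng.choice(parents)
            population.append([pi + rng.gauss(0, 0.3) for pi in p])
    best = min(archive, key=lambda t: t[1])
    return best[1], exact_evals

best, evals = asaga_sketch()
```

Because a fraction of the population is scored by the surrogate each generation, the number of exact evaluations stays well below the `pop * gens` a plain GA would spend, which is the cost saving the paper targets.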


Published in
GECCO '08: Proceedings of the 10th annual conference on Genetic and evolutionary computation
July 2008, 1814 pages
ISBN: 9781605581309
DOI: 10.1145/1389095
Conference Chair: Conor Ryan
Editor: Maarten Keijzer

              Copyright © 2008 ACM


Publisher
Association for Computing Machinery, New York, NY, United States



              Qualifiers

              • research-article

Acceptance Rates
Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%

