DOI: 10.1145/1274000.1274003

Expensive optimization, uncertain environment: an EA-based solution

Published: 07 July 2007

ABSTRACT

Real-life optimization problems often require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. The use of any population-based iterative technique, such as an evolutionary algorithm, in such problem domains is thus practically prohibitive. A feasible alternative is to build surrogates, that is, approximations of the actual fitness functions to be evaluated. Naturally, these surrogate or meta-models are orders of magnitude cheaper to evaluate than the actual function. This paper presents two evolutionary algorithm frameworks that involve surrogate-based fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [1], reduces computation time through the controlled use of meta-models (in this case, approximate models generated by support vector machine regression) to partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model, which does not account for problem domains involving uncertain environments. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle uncertain environments [2]. Empirical evaluation results are presented based on applying the frameworks to commonly used benchmark functions.
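The controlled alternation between expensive true evaluations and cheap surrogate evaluations described above can be sketched as follows. This is a minimal illustration only: it uses a deliberately simple inverse-distance-weighted interpolator as a stand-in for the paper's SVM regression meta-model, a sphere function as a stand-in for an expensive simulation, and all function names and parameters (`true_every`, `surrogate_assisted_ea`, etc.) are illustrative, not taken from the DAFHEA frameworks themselves.

```python
import math
import random

def true_fitness(x):
    # Expensive "true" fitness: sphere function (stands in for a costly
    # simulation; minimization, optimum 0 at the origin).
    return sum(xi * xi for xi in x)

def surrogate(archive, x):
    # Toy meta-model: inverse-distance-weighted interpolation over the
    # archive of truly evaluated points (in DAFHEA this role is played
    # by an SVM regression model trained on the archive).
    num = den = 0.0
    for xa, fa in archive:
        d = math.dist(xa, x)
        if d < 1e-12:
            return fa          # exact hit: return the stored true value
        w = 1.0 / (d * d)
        num += w * fa
        den += w
    return num / den

def surrogate_assisted_ea(dim=5, pop=20, gens=50, true_every=5, seed=0):
    rng = random.Random(seed)
    population = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    # Seed the archive with true evaluations of the initial population.
    archive = [(ind[:], true_fitness(ind)) for ind in population]
    for g in range(gens):
        # Controlled re-injection of true evaluations every few generations;
        # the rest of the time the cheap surrogate scores the population.
        use_true = (g % true_every == 0)
        scored = []
        for ind in population:
            if use_true:
                f = true_fitness(ind)
                archive.append((ind[:], f))   # grow the training archive
            else:
                f = surrogate(archive, ind)
            scored.append((f, ind))
        scored.sort(key=lambda t: t[0])
        parents = [ind for _, ind in scored[: pop // 2]]
        # Simple truncation selection plus Gaussian mutation.
        population = [[xi + rng.gauss(0, 0.3) for xi in rng.choice(parents)]
                      for _ in range(pop)]
    # Report the best truly evaluated point, never a surrogate estimate.
    return min(archive, key=lambda t: t[1])

best_x, best_f = surrogate_assisted_ea()
```

With these settings the run spends only about 220 true evaluations instead of the 1000 a fully exact EA would need, which is the cost trade-off the surrogate is there to exploit.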

References

  1. Bhattacharya, M. and Lu, G. DAFHEA: A Dynamic Approximate Fitness based Hybrid Evolutionary Algorithm. Proceedings of the IEEE Congress on Evolutionary Computation 2003, Vol. 3, pp. 1879--1886.
  2. Bhattacharya, M. Surrogate based Evolutionary Algorithm for Engineering Design Optimization. Proceedings of the Eighth International Conference on Cybernetics, Informatics and Systemics (ICCIS 2005), pp. 52--57.
  3. Bishop, C. Neural Networks for Pattern Recognition. Oxford University Press, 1995.
  4. Büche, D., Schraudolph, N. and Koumoutsakos, P. Accelerating Evolutionary Algorithms Using Fitness Function Models. Proc. Workshops, Genetic and Evolutionary Computation Conference, Chicago, 2003.
  5. Cherkassky, V. and Ma, Y. Multiple Model Estimation: A New Formulation for Predictive Learning. Under review, IEEE Transactions on Neural Networks.
  6. Dunham, B., Fridshal, D., Fridshal, R. and North, J. Design by natural selection. Synthese, 15, pp. 254--259, 1963.
  7. El-Beltagy, M. A. and Keane, A. J. Evolutionary optimization for computationally expensive problems using Gaussian processes. Proc. Int. Conf. on Artificial Intelligence (IC-AI'2001), CSREA Press, Las Vegas, pp. 708--714, 2001.
  8. Gunn, S. R. Support Vector Machines for Classification and Regression. Technical Report, School of Electronics and Computer Science, University of Southampton, Southampton, U.K., 1998.
  9. Hajela, P. and Lee, A. Topological optimization of rotorcraft subfloor structures for crashworthiness considerations. Computers and Structures, vol. 64, pp. 65--76, 1997.
  10. Hastie, T., Tibshirani, R. and Friedman, J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer Series in Statistics, ISBN 0-387-95284-5.
  11. Jin, Y. A comprehensive survey of fitness approximation in evolutionary computation. Soft Computing, 9(1), Springer, pp. 3--12, 2005.
  12. Jin, Y., Olhofer, M. and Sendhoff, B. A Framework for Evolutionary Optimization with Approximate Fitness Functions. IEEE Transactions on Evolutionary Computation, 6(5), pp. 481--494, 2002.
  13. Jin, Y., Olhofer, M. and Sendhoff, B. On Evolutionary Optimisation with Approximate Fitness Functions. Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2000), Las Vegas, Nevada, USA, pp. 786--793, July 2000.
  14. Kim, H. S. and Cho, S. B. An efficient genetic algorithm with less fitness evaluation by clustering. Proceedings of the IEEE Congress on Evolutionary Computation, pp. 887--894, 2001.
  15. Myers, R. and Montgomery, D. Response Surface Methodology. John Wiley & Sons, 1985.
  16. Pierret, S. Three-dimensional blade design by means of an artificial neural network and Navier-Stokes solver. Proceedings of the Fifth Conference on Parallel Problem Solving from Nature, Amsterdam, 1999.
  17. Rasheed, K. An Incremental-Approximate-Clustering Approach for Developing Dynamic Reduced Models for Design Optimization. Proceedings of the IEEE Congress on Evolutionary Computation, 2000.
  18. Rasheed, K., Vattam, S. and Ni, X. Comparison of Methods for Using Reduced Models to Speed Up Design Optimization. The Genetic and Evolutionary Computation Conference (GECCO 2002), 2002.
  19. Ratle, A. Accelerating the convergence of evolutionary algorithms by fitness landscape approximation. Parallel Problem Solving from Nature (PPSN V), Springer-Verlag, pp. 87--96, 1998.
  20. Sacks, J., Welch, W., Mitchell, T. and Wynn, H. Design and analysis of computer experiments. Statistical Science, 4(4), 1989.
  21. Schölkopf, B., Burges, C. and Smola, A. (eds.) Advances in Kernel Methods: Support Vector Learning. MIT Press, 1999.
  22. Smola, A. and Schölkopf, B. A Tutorial on Support Vector Regression. NeuroCOLT Technical Report NC-TR-98-030, Royal Holloway College, University of London, UK, 1998.
  23. Torczon, V. and Trosset, M. W. Using approximations to accelerate engineering design optimisation. ICASE Report No. 98-33, NASA Langley Research Center, Hampton, VA, 1998.
  24. Toropov, V., Filatov, A. and Polykin, A. Multiparameter structural optimization using FEM and multipoint explicit approximations. Structural Optimization, vol. 6, pp. 7--14, 1993.
  25. Vapnik, V. The Nature of Statistical Learning Theory. Springer-Verlag, New York, 1999.
  26. Vekeria, H. D. and Parmee, I. C. The use of a co-operative multi-level CHC GA for structural shape optimization. Fourth European Congress on Intelligent Techniques and Soft Computing (EUFIT'96), 1996.
  27. Won, K., Roy, T. and Tai, K. A Framework for Optimization Using Approximate Functions. Proceedings of the IEEE Congress on Evolutionary Computation 2003, Vol. 3.

Published in

      Copyright © 2007 ACM


Publisher: Association for Computing Machinery, New York, NY, United States
