ABSTRACT
Real-life optimization problems often require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. The use of any population-based iterative technique, such as an evolutionary algorithm, in such problem domains is thus practically prohibitive. A feasible alternative is to build surrogates, or approximations of the actual fitness functions, to be evaluated. Naturally, these surrogate or meta-models are orders of magnitude cheaper to evaluate than the actual functions. This paper presents two evolutionary algorithm frameworks that involve surrogate-based fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [1], reduces computation time through controlled use of meta-models (in this case, approximate models generated by support vector machine regression) to partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model, which does not account for problem domains involving uncertain environments. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model-based learning approach for the support vector machine approximator to handle uncertain environments [2]. Empirical evaluation results are presented based on applying the frameworks to commonly used benchmark functions.
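To make the abstract's evaluation-control idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of a surrogate-assisted evolutionary loop: most offspring are ranked by a cheap meta-model trained on archived exact evaluations, and only a controlled fraction of promising offspring are re-evaluated with the expensive function, which also refines the meta-model online. For self-containment, a simple inverse-distance-weighted regressor stands in for the SVM regression approximator, and the Sphere function stands in for an expensive simulation; all names and parameters here are illustrative.

```python
import math
import random

def sphere(x):
    """Benchmark fitness (Sphere); stands in for an expensive simulation."""
    return sum(xi * xi for xi in x)

class IDWSurrogate:
    """Inverse-distance-weighted regressor over archived exact evaluations.
    (A simple stand-in for an SVM regression meta-model.)"""
    def __init__(self, k=5):
        self.k, self.X, self.y = k, [], []

    def add(self, x, fx):
        self.X.append(x)
        self.y.append(fx)

    def predict(self, x):
        # k nearest archived points, weighted by inverse distance
        near = sorted((math.dist(x, xi), yi)
                      for xi, yi in zip(self.X, self.y))[:self.k]
        if near[0][0] < 1e-12:          # exact hit: return stored value
            return near[0][1]
        weights = [1.0 / d for d, _ in near]
        return sum(w * yi for w, (_, yi) in zip(weights, near)) / sum(weights)

def surrogate_ea(dim=3, pop=20, gens=40, true_frac=0.25, seed=0):
    """EA where only a fraction `true_frac` of offspring per generation
    receive exact (expensive) evaluations; the rest use the surrogate."""
    rng = random.Random(seed)
    sur = IDWSurrogate()
    true_evals = 0
    P = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for x in P:                          # seed the meta-model exactly
        sur.add(x, sphere(x))
        true_evals += 1
    P.sort(key=sur.predict)
    best = min(sur.y)
    for _ in range(gens):
        # mutate parents drawn from the surrogate-best half
        kids = [[xi + rng.gauss(0, 0.3) for xi in rng.choice(P[:pop // 2])]
                for _ in range(pop)]
        kids.sort(key=sur.predict)       # cheap approximate ranking
        n_true = max(1, int(true_frac * pop))
        for x in kids[:n_true]:          # verify only promising children
            fx = sphere(x)
            true_evals += 1
            sur.add(x, fx)               # refine the meta-model online
            best = min(best, fx)
        P = kids
    return best, true_evals
```

With the defaults above, the loop spends 20 + 40 × 5 = 220 exact evaluations instead of the 820 a fully exact EA of the same size would need, illustrating the cost saving that motivates surrogate-assisted evaluation.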
- Bhattacharya, M. and Lu, G. DAFHEA: A Dynamic Approximate Fitness based Hybrid Evolutionary Algorithm. Proceedings of the IEEE Congress on Evolutionary Computation 2003, Vol. 3, IEEE Catalogue No. 03TH8674C, ISBN 0-7803-7805-9, pp. 1879--1886.
- Bhattacharya, M. Surrogate based Evolutionary Algorithm for Engineering Design Optimization. Proceedings of the Eighth International Conference on Cybernetics, Informatics and Systemics (ICCIS 2005), ISBN 975-98458-9-X, pp. 52--57.
- Bishop, C. Neural Networks for Pattern Recognition. Oxford University Press, 1995.
- Büche, D., Schraudolph, N. and Koumoutsakos, P. Accelerating Evolutionary Algorithms Using Fitness Function Models. Proceedings of the Workshops of the Genetic and Evolutionary Computation Conference, Chicago, 2003.
- Cherkassky, V. and Ma, Y. Multiple Model Estimation: A New Formulation for Predictive Learning. Under review, IEEE Transactions on Neural Networks.
- Dunham, B., Fridshal, D., Fridshal, R. and North, J. Design by natural selection. Synthese, 15, pp. 254--259, 1963.
- El-Beltagy, M. A. and Keane, A. J. Evolutionary optimization for computationally expensive problems using Gaussian processes. Proceedings of the International Conference on Artificial Intelligence (IC-AI 2001), CSREA Press, Las Vegas, pp. 708--714, 2001.
- Gunn, S. R. Support Vector Machines for Classification and Regression. Technical Report, School of Electronics and Computer Science, University of Southampton, Southampton, U.K., 1998.
- Hajela, P. and Lee, A. Topological optimization of rotorcraft subfloor structures for crashworthiness considerations. Computers and Structures, vol. 64, pp. 65--76, 1997.
- Hastie, T., Tibshirani, R. and Friedman, J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer Series in Statistics, ISBN 0-387-95284-5.
- Jin, Y. A comprehensive survey of fitness approximation in evolutionary computation. Soft Computing Journal, 9(1), Springer, pp. 3--12, 2005.
- Jin, Y., Olhofer, M. and Sendhoff, B. A Framework for Evolutionary Optimization with Approximate Fitness Functions. IEEE Transactions on Evolutionary Computation, 6(5), pp. 481--494 (ISSN 1089-778X), 2002.
- Jin, Y., Olhofer, M. and Sendhoff, B. On Evolutionary Optimisation with Approximate Fitness Functions. Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), Las Vegas, Nevada, USA, pp. 786--793, July 10--12, 2000.
- Kim, H. S. and Cho, S. B. An efficient genetic algorithm with less fitness evaluation by clustering. Proceedings of the IEEE Congress on Evolutionary Computation, pp. 887--894, 2001.
- Myers, R. and Montgomery, D. Response Surface Methodology. John Wiley & Sons, 1985.
- Pierret, S. Three-dimensional blade design by means of an artificial neural network and Navier-Stokes solver. Proceedings of the Fifth Conference on Parallel Problem Solving from Nature, Amsterdam, 1999.
- Rasheed, K. An Incremental-Approximate-Clustering Approach for Developing Dynamic Reduced Models for Design Optimization. Proceedings of the IEEE Congress on Evolutionary Computation, 2000.
- Rasheed, K., Vattam, S. and Ni, X. Comparison of Methods for Using Reduced Models to Speed Up Design Optimization. Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2002), 2002.
- Ratle, A. Accelerating the convergence of evolutionary algorithms by fitness landscape approximation. Parallel Problem Solving from Nature (PPSN V), Springer-Verlag, pp. 87--96, 1998.
- Sacks, J., Welch, W., Mitchell, T. and Wynn, H. Design and analysis of computer experiments. Statistical Science, 4(4), 1989.
- Schölkopf, B., Burges, C. J. C. and Smola, A. (eds.). Advances in Kernel Methods: Support Vector Learning. MIT Press, 1999.
- Smola, A. and Schölkopf, B. A Tutorial on Support Vector Regression. NeuroCOLT Technical Report NC-TR-98-030, Royal Holloway College, University of London, UK, 1998.
- Torczon, V. and Trosset, M. W. Using approximations to accelerate engineering design optimisation. ICASE Report No. 98-33, NASA Langley Research Center, Hampton, VA, 1998.
- Toropov, V., Filatov, A. and Polykin, A. Multiparameter structural optimization using FEM and multipoint explicit approximations. Structural Optimization, vol. 6, pp. 7--14, 1993.
- Vapnik, V. The Nature of Statistical Learning Theory. Springer-Verlag, New York, 1999.
- Vekeria, H. D. and Parmee, I. C. The use of a co-operative multi-level CHC GA for structural shape optimization. Fourth European Congress on Intelligent Techniques and Soft Computing (EUFIT'96), 1996.
- Won, K., Roy, T. and Tai, K. A Framework for Optimization Using Approximate Functions. Proceedings of the IEEE Congress on Evolutionary Computation 2003, Vol. 3, IEEE Catalogue No. 03TH8674C, ISBN 0-7803-7805-9.
Index Terms
- Expensive optimization, uncertain environment: an EA-based solution