ABSTRACT
Genetic algorithms (GAs) applied to complex optimization domains typically require a large number of fitness function evaluations to reach near-optimal solutions. In real-world domains such as engineering design, each evaluation can be computationally expensive, so the fitness is often estimated or approximated instead. A popular approach is to construct a so-called surrogate, or meta-model, that mimics the behavior of the original fitness function but can be evaluated much faster. It is usually difficult to decide which approximate model to use and how often to use it, and the answer varies from problem to problem. To address this, an adaptive surrogate-assisted GA (ASAGA) is presented. ASAGA adaptively chooses the appropriate model type and adjusts both the model complexity and the frequency of model usage according to the time spent and the model's accuracy. ASAGA also introduces a stochastic penalty function method to handle constraints. Experiments show that ASAGA outperforms non-adaptive surrogate-assisted GAs with statistical significance.
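The core idea of adapting how often the surrogate replaces the true fitness function can be sketched as follows. This is a minimal illustration, not the paper's implementation: the expensive objective is a hypothetical sphere function, the surrogate is a toy nearest-neighbor predictor standing in for the polynomial/SVM-style meta-models the paper considers, and the adaptation rule (trust the surrogate more as its observed error shrinks) is a simplified stand-in for ASAGA's time- and accuracy-based schedule.

```python
import random

def true_fitness(x):
    # Hypothetical expensive objective: the sphere function.
    return sum(xi * xi for xi in x)

class NearestNeighborSurrogate:
    """Toy surrogate: predicts the fitness of the nearest archived point.
    A stand-in for the approximate models (polynomial, SVM, ...) a
    surrogate-assisted GA would fit to past exact evaluations."""
    def __init__(self):
        self.archive = []  # (point, exact fitness) pairs

    def add(self, x, f):
        self.archive.append((x, f))

    def predict(self, x):
        _, f = min(self.archive,
                   key=lambda pf: sum((a - b) ** 2 for a, b in zip(pf[0], x)))
        return f

def adaptive_surrogate_ga(dim=3, pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    surrogate = NearestNeighborSurrogate()
    exact_ratio = 1.0  # fraction of individuals evaluated exactly; adapted below
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]

    for _ in range(generations):
        scored, errors = [], []
        for ind in pop:
            if not surrogate.archive or rng.random() < exact_ratio:
                f = true_fitness(ind)                  # expensive evaluation
                if surrogate.archive:                  # track surrogate error
                    errors.append(abs(surrogate.predict(ind) - f))
                surrogate.add(ind, f)
            else:
                f = surrogate.predict(ind)             # cheap approximation
            scored.append((f, ind))
        # Adapt the frequency of model usage: the smaller the observed
        # surrogate error, the fewer exact evaluations we schedule.
        if errors:
            mean_err = sum(errors) / len(errors)
            exact_ratio = max(0.2, min(1.0, mean_err))
        # Minimal GA step: truncation selection plus Gaussian mutation.
        scored.sort(key=lambda t: t[0])
        parents = [ind for _, ind in scored[: pop_size // 2]]
        pop = [[g + rng.gauss(0, 0.3) for g in rng.choice(parents)]
               for _ in range(pop_size)]
    return min(true_fitness(ind) for ind in pop)
```

The floor of 0.2 on `exact_ratio` ensures the surrogate keeps being checked against the true function, so the archive and the error estimate never go stale; ASAGA's actual control of model type, complexity, and usage frequency is considerably richer.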