Abstract
We develop an efficient differential evolution (DE) algorithm with a neural-network-based approximation technique for computationally expensive problems, hereinafter called DE-ANN. We employ a multilayer feedforward ANN to approximate the original objective function, reducing the number of expensive fitness evaluations in DE. We also implement a fast training algorithm that uses the DE population as its training samples. During the evolution process, we combine individual-based and generation-based strategies for approximate model control. We compared the proposed algorithm with conventional DE on three benchmark test functions. The experimental results show that DE-ANN is a promising approach for computationally demanding real-world problems.
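The surrogate-assisted DE loop described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: for brevity, the paper's multilayer feedforward ANN surrogate is replaced by a simple inverse-distance-weighted interpolator over an archive of true evaluations, and model control is reduced to a simplified individual-based screen (a trial vector earns a true evaluation only if the surrogate predicts an improvement).

```python
import random

def sphere(x):
    """Stand-in for an expensive objective function."""
    return sum(v * v for v in x)

def surrogate(x, archive, k=3):
    """Cheap fitness prediction: inverse-distance-weighted average of the
    k nearest archived true evaluations (substitute for the paper's ANN)."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(x, p)) ** 0.5, f)
                   for p, f in archive)[:k]
    if dists[0][0] == 0.0:          # exact match in the archive
        return dists[0][1]
    w = [1.0 / d for d, _ in dists]
    return sum(wi * f for wi, (_, f) in zip(w, dists)) / sum(w)

def de_surrogate(dim=5, np_=20, gens=50, F=0.5, CR=0.9, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(np_)]
    fit = [sphere(x) for x in pop]
    archive = [(p[:], f) for p, f in zip(pop, fit)]
    evals = np_                      # count of true (expensive) evaluations
    for _ in range(gens):
        for i in range(np_):
            # DE/rand/1/bin mutation and binomial crossover
            a, b, c = rng.sample([j for j in range(np_) if j != i], 3)
            jr = rng.randrange(dim)
            trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                     if (rng.random() < CR or j == jr) else pop[i][j]
                     for j in range(dim)]
            # individual-based model control: screen with the surrogate,
            # spend a true evaluation only on promising trials
            if surrogate(trial, archive) < fit[i]:
                ft = sphere(trial)
                evals += 1
                archive.append((trial[:], ft))
                if ft < fit[i]:
                    pop[i], fit[i] = trial, ft
    return min(fit), evals
```

With this screening, the number of true evaluations stays well below the `np_ * gens` calls that plain DE would spend, which is the effect DE-ANN targets on expensive problems.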
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Wang, Y.S., Shi, Y.J., Yue, B.X., Teng, H.F. (2010). An Efficient Differential Evolution Algorithm with Approximate Fitness Functions Using Neural Networks. In: Wang, F.L., Deng, H., Gao, Y., Lei, J. (eds.) Artificial Intelligence and Computational Intelligence. AICI 2010. Lecture Notes in Computer Science, vol. 6320. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-16527-6_42
Print ISBN: 978-3-642-16526-9
Online ISBN: 978-3-642-16527-6