
An Efficient Differential Evolution Algorithm with Approximate Fitness Functions Using Neural Networks

  • Conference paper
Artificial Intelligence and Computational Intelligence (AICI 2010)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 6320)

Abstract

We develop an efficient differential evolution (DE) algorithm with a neural-network-based fitness approximation technique for computationally expensive problems, hereinafter called DE-ANN. We employ a multilayer feedforward ANN to approximate the original objective function, reducing the number of costly exact evaluations in DE. We also implement a fast training algorithm whose data samples are drawn from the DE population. During the evolution process, we combine individual-based and generation-based methods for approximate model control. We compared the proposed algorithm with conventional DE on three benchmark test functions. The experimental results showed that DE-ANN is capable of handling computationally demanding real-world problems.
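For orientation, the conventional DE baseline that DE-ANN is compared against (DE/rand/1/bin, Storn and Price) can be sketched as below. This is an illustrative implementation, not the authors' code; the point where a trained ANN surrogate would stand in for the exact objective is indicated only in a comment.

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop_size=20, F=0.5, CR=0.9, max_gens=200, seed=0):
    """Classic DE/rand/1/bin minimizer. In a surrogate-assisted variant such
    as DE-ANN, most calls to f would be replaced by an ANN prediction, with
    only selected individuals/generations re-evaluated exactly."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T  # bounds: list of (low, high)
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(max_gens):
        for i in range(pop_size):
            # Mutation: three distinct individuals, all different from i.
            r1, r2, r3 = rng.choice(
                [j for j in range(pop_size) if j != i], size=3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # Binomial crossover with one guaranteed mutant gene.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection (here: always the exact objective).
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

On a 3-dimensional sphere function over [-5, 5], this baseline converges to near zero within 200 generations; a surrogate-assisted version aims for similar solution quality with far fewer exact evaluations.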





Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wang, Y.-S., Shi, Y.-J., Yue, B.-X., Teng, H.-F. (2010). An Efficient Differential Evolution Algorithm with Approximate Fitness Functions Using Neural Networks. In: Wang, F.L., Deng, H., Gao, Y., Lei, J. (eds) Artificial Intelligence and Computational Intelligence. AICI 2010. Lecture Notes in Computer Science, vol 6320. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-16527-6_42


  • DOI: https://doi.org/10.1007/978-3-642-16527-6_42

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-16526-9

  • Online ISBN: 978-3-642-16527-6

