Mixing Different Search Biases in Evolutionary Learning Algorithms

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5768)

Abstract

This work investigates the benefits of using different distribution functions in evolutionary learning algorithms with respect to the generalization ability of Artificial Neural Networks (ANNs). We examine two modifications of the recently proposed network weight-based evolutionary algorithm (NWEA), which mix mutation strategies based on three distribution functions at the chromosome and the gene levels. The use of combined search strategies in ANN training implies that the different step sizes determined by the mixed distributions will direct the evolution towards well-generalizing ANNs.
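The mixing of search biases described above can be illustrated with a short sketch: one mutation variant picks a step-size distribution once per chromosome, the other picks a distribution independently for each gene. The Python fragment below is a minimal sketch under assumed distributions (Gaussian, Cauchy, uniform) and a uniform random choice between them; it illustrates the general idea only and is not the NWEA modifications studied in the paper.

```python
import math
import random

# Hypothetical sketch of mixing mutation step-size distributions.
# The three distributions (Gaussian, Cauchy, uniform) and the random
# selection scheme are illustrative assumptions, not the paper's method.

def gaussian_step(scale):
    # Mostly small, localized perturbations.
    return random.gauss(0.0, scale)

def cauchy_step(scale):
    # Heavy-tailed perturbations: occasional large jumps.
    return scale * math.tan(math.pi * (random.random() - 0.5))

def uniform_step(scale):
    # Bounded perturbations of roughly equal likelihood.
    return random.uniform(-scale, scale)

DISTRIBUTIONS = (gaussian_step, cauchy_step, uniform_step)

def mutate_chromosome_level(weights, scale=0.1):
    """Draw one distribution per chromosome and apply it to every gene."""
    step = random.choice(DISTRIBUTIONS)
    return [w + step(scale) for w in weights]

def mutate_gene_level(weights, scale=0.1):
    """Draw a distribution independently for each gene (network weight)."""
    return [w + random.choice(DISTRIBUTIONS)(scale) for w in weights]

if __name__ == "__main__":
    parent = [0.5, -1.2, 0.03, 0.8]   # toy chromosome of connection weights
    print(mutate_chromosome_level(parent))
    print(mutate_gene_level(parent))
```

In this sketch, chromosome-level mixing gives every weight of an offspring the same bias, while gene-level mixing lets a single offspring combine small local adjustments with occasional long jumps.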





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Davoian, K., Lippe, WM. (2009). Mixing Different Search Biases in Evolutionary Learning Algorithms. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds) Artificial Neural Networks – ICANN 2009. ICANN 2009. Lecture Notes in Computer Science, vol 5768. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04274-4_12

  • DOI: https://doi.org/10.1007/978-3-642-04274-4_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04273-7

  • Online ISBN: 978-3-642-04274-4
