Optimal Design of Neural Nets Using Hybrid Algorithms

  • Conference paper
PRICAI 2000 Topics in Artificial Intelligence (PRICAI 2000)

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 1886))

Abstract

Selecting the topology of a network and the correct parameters for the learning algorithm is a tedious task in designing an optimal Artificial Neural Network (ANN), one that is smaller, faster and has better generalization performance. A Genetic Algorithm (GA) is an adaptive search technique based on the principles and mechanisms of natural selection and survival of the fittest in natural evolution. Simulated Annealing (SA) is a global optimization algorithm that can handle cost functions with quite arbitrary degrees of nonlinearity, discontinuity and stochasticity while statistically assuring an optimal solution. In this paper we explain how a hybrid algorithm integrating the desirable aspects of GA and SA can be applied to the optimal design of an ANN. The paper is chiefly concerned with current theoretical developments in Evolutionary Artificial Neural Networks (EANNs) using GAs and other heuristic procedures, and with how the proposed hybrid and these heuristic procedures can be combined to produce an optimal ANN.
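The abstract's idea of combining GA-style recombination with SA-style acceptance can be illustrated with a minimal sketch. This is not the authors' algorithm: the genome encoding (hidden-unit counts per layer), the toy `fitness` function standing in for "train the network and measure validation error plus a size penalty", and the geometric cooling schedule are all assumptions chosen for illustration. The GA contributes crossover and mutation over a population of candidate topologies; the SA contribution is the Metropolis test that decides whether an offspring replaces its parent.

```python
# Hypothetical hybrid GA/SA sketch for topology search (not the paper's method).
import math
import random

random.seed(0)

def fitness(genome):
    # Toy stand-in cost: a quadratic "error" term plus a size penalty,
    # mimicking the goal of a smaller net with better generalization.
    hidden_units = sum(genome)
    return (hidden_units - 12) ** 2 + 0.1 * hidden_units

def crossover(a, b):
    # One-point crossover over the assumed per-layer encoding.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome, rate=0.2):
    # Perturb each gene (hidden-unit count) with small integer steps.
    return [max(0, g + random.choice([-1, 0, 1])) if random.random() < rate else g
            for g in genome]

def hybrid_ga_sa(pop_size=20, genome_len=4, generations=50,
                 t0=10.0, cooling=0.9):
    # Each genome lists hidden-unit counts per layer (an assumed encoding).
    pop = [[random.randrange(0, 10) for _ in range(genome_len)]
           for _ in range(pop_size)]
    temperature = t0
    for _ in range(generations):
        new_pop = []
        for parent in pop:
            child = mutate(crossover(parent, random.choice(pop)))
            delta = fitness(child) - fitness(parent)
            # Metropolis acceptance: always keep improvements; accept a
            # worse offspring with probability exp(-delta / T).
            if delta <= 0 or random.random() < math.exp(-delta / temperature):
                new_pop.append(child)
            else:
                new_pop.append(parent)
        pop = new_pop
        temperature *= cooling  # geometric cooling schedule
    return min(pop, key=fitness)

best = hybrid_ga_sa()
print(best, fitness(best))
```

At high temperature the search behaves like a permissive GA and explores freely; as the temperature decays, acceptance tightens and the population settles toward low-cost topologies, which is the complementarity the abstract points to.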




Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Abraham, A., Nath, B. (2000). Optimal Design of Neural Nets Using Hybrid Algorithms. In: Mizoguchi, R., Slaney, J. (eds) PRICAI 2000 Topics in Artificial Intelligence. PRICAI 2000. Lecture Notes in Computer Science, vol 1886. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44533-1_52

  • DOI: https://doi.org/10.1007/3-540-44533-1_52

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-67925-7

  • Online ISBN: 978-3-540-44533-3

  • eBook Packages: Springer Book Archive
