Abstract
Selecting a network topology and suitable learning-algorithm parameters is a tedious part of designing an optimal Artificial Neural Network (ANN): one that is smaller, faster, and generalizes better. A genetic algorithm (GA) is an adaptive search technique based on the principles and mechanisms of natural selection and survival of the fittest. Simulated annealing (SA) is a global optimization algorithm that can handle cost functions with quite arbitrary degrees of nonlinearity, discontinuity, and stochasticity while statistically assuring an optimal solution. In this paper we explain how a hybrid algorithm integrating the desirable aspects of GA and SA can be applied to the optimal design of an ANN. The paper is chiefly concerned with current theoretical developments in Evolutionary Artificial Neural Networks (EANNs) using GAs and other heuristic procedures, and with how the proposed hybrid and these procedures can be combined to produce an optimal ANN.
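As a concrete illustration of how GA and SA can be integrated, one common pattern is a GA whose replacement step uses SA's Metropolis acceptance rule under a cooling schedule, so that worse offspring are occasionally accepted early on and the search hardens as the temperature drops. The sketch below is an illustrative assumption, not the algorithm proposed in the paper: it evolves the nine weights of a fixed 2-2-1 feedforward network on the XOR task, and all names and parameter values are hypothetical.

```python
import math
import random

random.seed(0)

# Toy task: XOR truth table (illustrative assumption, not from the paper).
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    """Fixed 2-2-1 tanh network; w holds 9 weights (2x3 hidden + 3 output, biases included)."""
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

def cost(w):
    """Sum-of-squares error over the XOR patterns."""
    return sum((forward(w, x) - y) ** 2 for x, y in XOR)

def crossover(a, b):
    """Uniform crossover: each gene taken from either parent with equal probability."""
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def mutate(w, scale):
    """Gaussian perturbation of one randomly chosen weight."""
    child = list(w)
    i = random.randrange(len(child))
    child[i] += random.gauss(0, scale)
    return child

def hybrid_ga_sa(pop_size=20, generations=300, t0=1.0, cooling=0.99):
    pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(pop_size)]
    temp = t0
    for _ in range(generations):
        # GA step: tournament selection plus crossover and mutation produce a candidate.
        p1 = min(random.sample(pop, 3), key=cost)
        p2 = min(random.sample(pop, 3), key=cost)
        child = mutate(crossover(p1, p2), scale=0.5)
        # SA step: Metropolis criterion decides whether the child replaces the worst member.
        worst = max(range(pop_size), key=lambda i: cost(pop[i]))
        delta = cost(child) - cost(pop[worst])
        if delta < 0 or random.random() < math.exp(-delta / temp):
            pop[worst] = child
        temp *= cooling  # annealing schedule cools the acceptance rule over time
    return min(pop, key=cost)

best = hybrid_ga_sa()
print(round(cost(best), 3))
```

Because only the worst population member is ever replaced, the best solution found so far is never lost, while the temperature-controlled acceptance lets the population escape poor local optima early in the run.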
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Abraham, A., Nath, B. (2000). Optimal Design of Neural Nets Using Hybrid Algorithms. In: Mizoguchi, R., Slaney, J. (eds) PRICAI 2000 Topics in Artificial Intelligence. PRICAI 2000. Lecture Notes in Computer Science(), vol 1886. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44533-1_52
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-67925-7
Online ISBN: 978-3-540-44533-3