Abstract
This work investigates the benefits of using different distribution functions in evolutionary learning algorithms with respect to the generalization ability of Artificial Neural Networks (ANNs). We examine two modifications of the recently proposed network weight-based evolutionary algorithm (NWEA), mixing mutation strategies based on three distribution functions at the chromosome level and at the gene level. The use of combined search strategies in ANN training implies that the different step sizes produced by the mixed distributions will direct the evolution towards well-generalizing ANNs.
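The abstract does not reproduce the NWEA update rule, but the core idea of mixing search biases can be sketched as follows. This is a minimal illustration, not the authors' formulation: the three distributions (Gaussian, Cauchy, and a Mantegna-style Lévy approximation), the `scale` parameter, and all function names are assumptions chosen to match the step-size literature the paper builds on.

```python
import math
import numpy as np

def levy_step(beta=1.5, rng=None):
    # Mantegna's algorithm: an approximate Levy-stable step (assumption,
    # a common choice; the paper's actual Levy sampling is not shown here).
    rng = rng or np.random.default_rng()
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma)
    v = rng.normal(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def mutate_gene_level(weights, scale=0.1, rng=None):
    """Gene-level mixing: each weight independently draws its perturbation
    from one of the three distributions, so a single offspring combines
    small local steps with occasional long jumps."""
    rng = rng or np.random.default_rng()
    out = weights.copy()
    for i in range(out.size):
        choice = rng.integers(3)
        if choice == 0:
            step = rng.normal(0.0, 1.0)       # Gaussian: fine local search
        elif choice == 1:
            step = rng.standard_cauchy()      # Cauchy: heavier tails, longer jumps
        else:
            step = levy_step(rng=rng)         # Levy: tunable heavy-tailed jumps
        out[i] += scale * step
    return out

def mutate_chromosome_level(weights, scale=0.1, rng=None):
    """Chromosome-level mixing: one distribution is chosen per offspring
    and applied to every weight, so the population as a whole mixes
    search biases across individuals."""
    rng = rng or np.random.default_rng()
    choice = rng.integers(3)
    if choice == 0:
        steps = rng.normal(0.0, 1.0, size=weights.shape)
    elif choice == 1:
        steps = rng.standard_cauchy(size=weights.shape)
    else:
        steps = np.array([levy_step(rng=rng) for _ in range(weights.size)])
    return weights + scale * steps
```

The two functions make the distinction in the abstract concrete: gene-level mixing varies the bias within one chromosome, while chromosome-level mixing varies it between offspring.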
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Davoian, K., Lippe, WM. (2009). Mixing Different Search Biases in Evolutionary Learning Algorithms. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds) Artificial Neural Networks – ICANN 2009. ICANN 2009. Lecture Notes in Computer Science, vol 5768. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04274-4_12
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-04273-7
Online ISBN: 978-3-642-04274-4