Abstract
The widespread use of artificial neural networks, together with the difficulty of correctly specifying (tuning) their parameters for a given problem, motivates the approach proposed in this paper. The approach employs an evolutionary search to tune initial weights, transfer functions, architectures, and learning rules (learning-algorithm parameters) simultaneously. Experiments demonstrate that the method finds efficient networks with satisfactory generalization in a shorter search time.
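To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' exact algorithm): a small genetic search over an MLP's hidden-layer size, learning rate, transfer function, and initial-weight scale, scoring each candidate by its training error on the XOR problem. All names and parameter ranges here are illustrative assumptions.

```python
import random
import numpy as np

# Illustrative sketch only: evolve an MLP's hyperparameters
# (hidden units, learning rate, transfer function, weight scale)
# with a tiny truncation-selection genetic algorithm.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Candidate transfer functions with derivatives expressed in terms of
# the activation value a (not the pre-activation).
ACTS = {
    "tanh": (np.tanh, lambda a: 1 - a ** 2),
    "sigmoid": (lambda z: 1 / (1 + np.exp(-z)), lambda a: a * (1 - a)),
}

def fitness(genome):
    """Train a one-hidden-layer MLP briefly; return its mean squared error."""
    hidden, lr, act_name, scale = genome
    act, dact = ACTS[act_name]
    W1 = rng.standard_normal((2, hidden)) * scale
    W2 = rng.standard_normal((hidden, 1)) * scale
    for _ in range(500):
        h = act(X @ W1)
        out = 1 / (1 + np.exp(-(h @ W2)))       # sigmoid output unit
        delta2 = (out - y) * out * (1 - out)    # MSE gradient at output
        grad1 = X.T @ ((delta2 @ W2.T) * dact(h))
        W2 -= lr * (h.T @ delta2)
        W1 -= lr * grad1
    return float(np.mean((out - y) ** 2))       # lower is better

def random_genome():
    return (random.choice([2, 3, 4, 6, 8]),        # hidden units
            random.choice([0.1, 0.3, 0.5, 1.0]),   # learning rate
            random.choice(list(ACTS)),             # transfer function
            random.choice([0.5, 1.0, 2.0]))        # initial-weight scale

def mutate(genome):
    i = random.randrange(4)                        # resample one gene
    g = list(genome)
    g[i] = random_genome()[i]
    return tuple(g)

# Truncation-selection GA: keep the best half, refill with mutants.
random.seed(0)
pop = [random_genome() for _ in range(8)]
for _ in range(5):
    pop = sorted(pop, key=fitness)
    pop = pop[:4] + [mutate(random.choice(pop[:4])) for _ in range(4)]
best = min(pop, key=fitness)
print("best genome:", best)
```

The point of the sketch is that one fitness evaluation jointly exercises initialization, architecture, transfer function, and learning parameters, so the evolutionary search tunes them simultaneously rather than one at a time.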
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Almeida, L.M., Ludermir, T.B. (2008). An Evolutionary Approach for Tuning Artificial Neural Network Parameters. In: Corchado, E., Abraham, A., Pedrycz, W. (eds) Hybrid Artificial Intelligence Systems. HAIS 2008. Lecture Notes in Computer Science, vol 5271. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87656-4_20
Print ISBN: 978-3-540-87655-7
Online ISBN: 978-3-540-87656-4