Abstract
This paper presents a new strategy for designing artificial neural networks (ANNs), which we call the adaptive merging and growing strategy (AMGS). Unlike most previous ANN design strategies, AMGS emphasizes autonomous functioning in the design process. The strategy reduces or increases an ANN's size during training based on the learning ability of its hidden neurons and on the training progress of the network, respectively. To reduce the network size, it merges correlated hidden neurons; to increase the size, it splits existing hidden neurons. AMGS has been tested on designing ANNs for five benchmark classification problems: the Australian credit card assessment, diabetes, heart, iris, and thyroid problems. The experimental results show that the proposed strategy can design compact ANNs with good generalization ability.
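The core merging idea in the abstract can be illustrated with a minimal sketch: find the pair of hidden neurons whose activations are most correlated over the training data, and fold one into the other by summing their outgoing weights. This is not the paper's exact AMGS procedure (which also ties merging to learning ability and training progress); the toy network, its sizes, and the merge rule below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer network: 4 inputs -> 6 tanh hidden neurons -> 1 output.
# (Sizes are arbitrary; the paper does not prescribe this architecture.)
W_in = rng.normal(size=(6, 4))    # input-to-hidden weights
W_out = rng.normal(size=(1, 6))   # hidden-to-output weights

# Artificially make hidden neurons 1 and 2 redundant by duplicating input weights.
W_in[2] = W_in[1]

X = rng.normal(size=(100, 4))     # sample training inputs
H = np.tanh(X @ W_in.T)           # hidden activations, shape (100, 6)

# Find the most correlated pair of hidden neurons.
C = np.corrcoef(H.T)              # 6x6 activation correlation matrix
np.fill_diagonal(C, 0.0)          # ignore self-correlation
i, j = np.unravel_index(np.argmax(np.abs(C)), C.shape)

# Merge neuron j into neuron i: sum their outgoing weights, then drop neuron j.
# If the two activations are (nearly) identical, the network output is preserved.
W_out_merged = W_out.copy()
W_out_merged[:, i] += W_out_merged[:, j]
W_out_merged = np.delete(W_out_merged, j, axis=1)
W_in_merged = np.delete(W_in, j, axis=0)
```

Because the duplicated neurons contribute identical activations, summing their output weights leaves the network function unchanged while shrinking the hidden layer by one neuron, which is the compactness effect the abstract describes.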
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Islam, M.M., Sattar, M.A., Amin, M.F., Murase, K. (2008). A New Adaptive Strategy for Pruning and Adding Hidden Neurons during Training Artificial Neural Networks. In: Fyfe, C., Kim, D., Lee, SY., Yin, H. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2008. IDEAL 2008. Lecture Notes in Computer Science, vol 5326. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-88906-9_6
DOI: https://doi.org/10.1007/978-3-540-88906-9_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-88905-2
Online ISBN: 978-3-540-88906-9
eBook Packages: Computer Science (R0)