
Generalization Capability of Artificial Neural Network Incorporated with Pruning Method

  • Conference paper
Advanced Computing, Networking and Security (ADCONS 2011)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7135)

Abstract

In any real-world application, the performance of an Artificial Neural Network (ANN) depends largely on its generalization capability, that is, its ability to handle unseen data. The generalization capability of a network is determined mainly by its complexity and by how it is trained. Poor generalization is observed when the network is over-trained or when its complexity (degrees of freedom) is large relative to the training data. A smaller network that can still fit the data will have better generalization ability. Pruning of network parameters is one of the promising methods for reducing the degrees of freedom of a network and hence improving its generalization. In recent years, various pruning methods have been developed and found effective in real-world applications. It is therefore important to estimate the improvement in generalization, and the rate of that improvement, as pruning is incorporated into the network. In this research, a method is developed to evaluate generalization capability and the rate of convergence towards generalization. Using the proposed method, experiments have been conducted on a Multi-Layer Perceptron neural network with pruning incorporated, applied to handwritten numeral recognition.
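
The abstract does not specify which pruning criterion or generalization measure the experiments use. As an illustration only, the sketch below applies simple magnitude-based weight pruning to a deliberately over-sized multi-layer perceptron trained on scikit-learn's small handwritten-digit set, and uses the train/test accuracy gap as a rough stand-in for a generalization measure; the function prune_by_magnitude, the pruning fractions, and the network size are assumptions made for the example, not the authors' method.

```python
# Illustrative sketch only (assumed setup, not the paper's method):
# magnitude-based weight pruning of an MLP on a small handwritten-digit set,
# with the train/test accuracy gap used as a rough generalization indicator.
import copy

import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Deliberately over-sized hidden layer, so pruning has redundancy to remove.
mlp = MLPClassifier(hidden_layer_sizes=(128,), max_iter=500, random_state=0)
mlp.fit(X_train, y_train)

def prune_by_magnitude(model, fraction):
    """Zero the smallest-magnitude weights in each layer, in place."""
    for W in model.coefs_:
        threshold = np.quantile(np.abs(W), fraction)
        W[np.abs(W) < threshold] = 0.0

for fraction in (0.0, 0.2, 0.4, 0.6):
    pruned = copy.deepcopy(mlp)          # keep the original network intact
    prune_by_magnitude(pruned, fraction)
    train_acc = pruned.score(X_train, y_train)
    test_acc = pruned.score(X_test, y_test)
    print(f"pruned {fraction:.0%}: test accuracy = {test_acc:.3f}, "
          f"train-test gap = {train_acc - test_acc:.3f}")
```

A shrinking gap between training and test accuracy as the pruning fraction grows would be consistent with the abstract's claim that reducing degrees of freedom improves generalization, up to the point where pruning starts to remove weights the task actually needs.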





Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Urolagin, S., Prema, K.V., Reddy, N.V.S. (2012). Generalization Capability of Artificial Neural Network Incorporated with Pruning Method. In: Thilagam, P.S., Pais, A.R., Chandrasekaran, K., Balakrishnan, N. (eds) Advanced Computing, Networking and Security. ADCONS 2011. Lecture Notes in Computer Science, vol 7135. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29280-4_19


  • DOI: https://doi.org/10.1007/978-3-642-29280-4_19

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-29279-8

  • Online ISBN: 978-3-642-29280-4

