
A new sensitivity-based pruning technique for feed-forward neural networks that improves generalization


Abstract:

Multi-layer neural networks of the back-propagation type (MLP-networks) have become a well-established tool in various application areas. Reliable solutions, however, also require sufficient generalization capabilities of the formed networks and an easy interpretation of their function. These characteristics are strongly related to less sensitive networks with an optimized network structure. In this paper, we introduce a new pruning technique called SCGSIR that is inspired by the fast method of scaled conjugate gradients (SCG) and by sensitivity analysis. Inhibiting network sensitivity during training supports efficient optimization of the network structure. Experiments performed so far yield promising results, outperforming the reference techniques in both their ability to find networks with optimum architecture and improved generalization.
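The abstract's core idea is sensitivity-based pruning: rank network elements by how much the output depends on them and remove the least influential ones. The sketch below is a minimal, hypothetical illustration of that general idea on a tiny one-hidden-layer MLP; it is not the paper's SCGSIR algorithm, and the perturbation-based sensitivity measure and pruning fraction used here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer MLP with random weights (a stand-in network;
# SCGSIR's actual training and sensitivity measure are not reproduced here).
W1 = rng.normal(size=(4, 8))   # input -> hidden weights
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
X = rng.normal(size=(100, 4))  # sample inputs

def forward(X, W1, W2, keep):
    # `keep` is a 0/1 mask over hidden units; zeroed units are "pruned".
    h = np.tanh(X @ W1) * keep
    return h @ W2

baseline = forward(X, W1, W2, np.ones(8))

# Sensitivity of each hidden unit: mean absolute change in the output
# when that single unit is removed from the network.
sens = np.empty(8)
for j in range(8):
    keep = np.ones(8)
    keep[j] = 0.0
    sens[j] = np.mean(np.abs(forward(X, W1, W2, keep) - baseline))

# Prune the least sensitive hidden units (bottom 25%, an arbitrary choice).
n_prune = 2
pruned = np.argsort(sens)[:n_prune]
keep = np.ones(8)
keep[pruned] = 0.0
print("pruned hidden units:", sorted(pruned.tolist()))
```

In practice, such pruning is interleaved with retraining so the remaining weights can compensate; the paper additionally shapes sensitivity during training itself rather than only measuring it afterwards.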
Date of Conference: 31 July 2011 - 05 August 2011
Date Added to IEEE Xplore: 03 October 2011
Conference Location: San Jose, CA, USA
