Abstract:
The generalization ability of an MLP network has been shown to be related to both the number and magnitudes of the network weights. Thus, there exists a tension between employing networks with few weights that have relatively large magnitudes, and networks with a greater number of weights with relatively small magnitudes. The analysis presented in this paper indicates that large magnitudes for network weights potentially increase the propensity of a network to interpolate poorly. Experimental results indicate that when bounds are imposed on network weights, the backpropagation algorithm is capable of discovering networks with small weight magnitudes that retain their expressive power and exhibit good generalization.
Date of Conference: 31 July 2005 - 04 August 2005
Date Added to IEEE Xplore: 27 December 2005
Print ISBN: 0-7803-9048-2
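
Below is a minimal sketch, not the paper's implementation, of the idea the abstract describes: training an MLP with plain backpropagation while imposing a bound on the weight magnitudes by clipping every weight after each update. The toy XOR task, hidden-layer width, bound B, learning rate, and epoch count are all illustrative assumptions.

```python
# Illustrative sketch only: backpropagation with an explicit bound B on
# weight magnitudes, enforced by clipping after every gradient step.
# Network size, bound, learning rate, and the XOR task are assumptions,
# not values taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

B = 5.0    # assumed bound on |weight|
lr = 0.5   # assumed learning rate
H = 8      # assumed hidden-layer width

# One hidden layer, sigmoid activations
W1 = rng.normal(scale=0.5, size=(2, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass for squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent update
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

    # Impose the bound: clip every weight into [-B, B]
    np.clip(W1, -B, B, out=W1)
    np.clip(W2, -B, B, out=W2)

print("predictions:", out.ravel().round(3))
print("max |weight|:", max(np.abs(W1).max(), np.abs(W2).max()))
```

Clipping is only one way to impose such a bound; a penalty term on weight magnitudes or a projection onto a norm ball would serve the same illustrative purpose.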