Abstract
Like other gradient descent techniques, backpropagation converges slowly, even for medium-sized network problems. This is a consequence of the usually large dimension of the weight space and of the particular shape of the error surface around each iteration point. Oscillation between the walls of deep, narrow valleys, for example, is a well-known case in which gradient descent yields poor convergence rates.
In this work, we present an acceleration technique for the backpropagation algorithm based on the individual adaptation of the learning rate parameter of each synapse. The efficiency of the method is discussed and several related issues are analyzed.
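The per-synapse adaptation sketched in the abstract can be illustrated in a few lines. The update rule below is an assumption made for illustration (a common sign-agreement heuristic, not necessarily the authors' exact method): each weight keeps its own learning rate, which grows while successive gradient components agree in sign and shrinks when the sign flips, damping the valley oscillation the abstract mentions. The function name `adaptive_gd`, the factors `up`/`down`, and the quadratic test problem are all hypothetical.

```python
import numpy as np

def adaptive_gd(grad_fn, w0, eta0=0.01, up=1.2, down=0.5, steps=200):
    """Gradient descent with an individual learning rate per weight.

    Hypothetical sign-agreement rule: if a weight's gradient component
    keeps its sign between iterations, that weight's rate is multiplied
    by `up`; if the sign flips (oscillation across a valley), the rate
    is multiplied by `down`. A zero product leaves the rate unchanged.
    """
    w = np.asarray(w0, dtype=float)
    eta = np.full_like(w, eta0)       # one learning rate per weight
    prev_g = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        agree = g * prev_g            # >0: same sign, <0: sign flip
        eta = np.where(agree > 0, eta * up,
                       np.where(agree < 0, eta * down, eta))
        w -= eta * g
        prev_g = g
    return w

# Ill-conditioned quadratic: a "deep narrow valley" along one axis,
# where a single global learning rate would either crawl or oscillate.
grad = lambda w: np.array([2.0 * w[0], 200.0 * w[1]])
w_final = adaptive_gd(grad, [1.0, 1.0])
```

On this toy problem the steep coordinate triggers a sign flip, its rate is cut, and the shallow coordinate's rate grows steadily, so both converge far faster than with one shared step size.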
© 1990 Springer-Verlag Berlin Heidelberg
Cite this paper
Silva, F.M., Almeida, L.B. (1990). Acceleration techniques for the backpropagation algorithm. In: Almeida, L.B., Wellekens, C.J. (eds) Neural Networks. EURASIP 1990. Lecture Notes in Computer Science, vol 412. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-52255-7_32
DOI: https://doi.org/10.1007/3-540-52255-7_32
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-52255-3
Online ISBN: 978-3-540-46939-1