Improved convergence rate of back-propagation with dynamic adaption of the learning rate

  • Neural Networks
  • Conference paper

Parallel Problem Solving from Nature (PPSN 1990)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 496)

Abstract

This article deals with back-propagation, a learning method for neural networks. It shows, first, how the introduction of test cycles brings about a considerable improvement in the convergence rate and, second, how the costly experiments otherwise needed to tune learning-relevant parameters can be dispensed with.
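
The abstract only sketches the mechanism: the learning rate adapts itself through test cycles, i.e., trial steps taken at an increased and a decreased rate, keeping whichever yields the lower error. The following Python sketch illustrates one plausible reading of such a scheme; the names `loss`, `grad`, and the adaptation factor `zeta` are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def train(loss, grad, w, eta=0.1, zeta=2.0, epochs=100):
    # Gradient descent whose learning rate adapts itself via
    # two test cycles per epoch (sketch, not the paper's exact method).
    for _ in range(epochs):
        g = grad(w)
        # Test cycles: trial steps with an increased and a
        # decreased learning rate (zeta is an assumed factor).
        candidates = [eta * zeta, eta / zeta]
        trials = [w - e * g for e in candidates]
        errors = [loss(t) for t in trials]
        best = int(np.argmin(errors))
        eta = candidates[best]  # keep the rate that won the test
        w = trials[best]        # accept the corresponding step
    return w, eta

# Toy usage: a quadratic bowl with its minimum at (1, -2).
target = np.array([1.0, -2.0])
loss = lambda w: float(np.sum((w - target) ** 2))
grad = lambda w: 2.0 * (w - target)
w, eta = train(loss, grad, np.zeros(2))
print(w, eta)  # w approaches [1, -2] without hand-tuning eta
```

Because the better of the two trial rates is adopted every epoch, `eta` settles on a workable step size by itself, which is the sense in which manual tuning experiments become unnecessary.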

Author information

R. Salomon

Editor information

Hans-Paul Schwefel, Reinhard Männer


Copyright information

© 1991 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Salomon, R. (1991). Improved convergence rate of back-propagation with dynamic adaption of the learning rate. In: Schwefel, HP., Männer, R. (eds) Parallel Problem Solving from Nature. PPSN 1990. Lecture Notes in Computer Science, vol 496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0029763

  • DOI: https://doi.org/10.1007/BFb0029763

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-54148-6

  • Online ISBN: 978-3-540-70652-6

  • eBook Packages: Springer Book Archive
