Learning by back-propagation: Computing in a systolic way

  • Submitted Presentations
  • Conference paper
PARLE '89 Parallel Architectures and Languages Europe (PARLE 1989)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 366)

Abstract

In this paper we present a systolic algorithm for back-propagation, a supervised, iterative, gradient-descent, connectionist learning rule. The algorithm works on feedforward networks in which connections can skip layers, and it fully exploits the spatial and training parallelism inherent to back-propagation. Spatial parallelism arises during the propagation of activity (forward) and of error (backward) for a particular input-output pair; training parallelism is obtained when this computation is carried out simultaneously for all input-output pairs. In the spatial dimension, a single systolic ring carries out sequentially the three main steps of the learning rule: the forward pass, the backward pass and the weight-increment update. Furthermore, the same pattern of matrix delivery is used in both the forward and the backward passes, so the algorithm preserves the similarity of the two passes in the original model. The resulting systolic algorithm is dual with respect to the pattern of matrix delivery, which may be either by columns or by rows.
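To make the three steps concrete, here is a minimal Python sketch of serial back-propagation for a strictly layered feedforward network (no skip-layer connections, no bias terms), assuming a sigmoid activation and a fixed learning rate; all names and values are illustrative. It is not the systolic-ring mapping presented in the paper: it only spells out the forward pass, the backward pass and the accumulation of weight increments over all input-output pairs, the computation that the systolic algorithm distributes over the ring.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_epoch(weights, inputs, targets, lr=0.5):
    # Forward pass: propagate activity layer by layer.
    activations = [inputs]
    for W in weights:
        activations.append(sigmoid(activations[-1] @ W))

    # Backward pass: propagate the error using the transposed weight matrices.
    out = activations[-1]
    delta = (out - targets) * out * (1.0 - out)        # output-layer error
    increments = []
    for W, a in zip(reversed(weights), reversed(activations[:-1])):
        increments.append(a.T @ delta)                  # weight increments for this layer
        delta = (delta @ W.T) * a * (1.0 - a)           # error for the layer below

    # Weight-increment update, applied once all input-output pairs are processed.
    return [W - lr * dW for W, dW in zip(weights, reversed(increments))]

# Illustrative usage on a tiny 2-3-1 network (hypothetical data and sizes).
rng = np.random.default_rng(0)
weights = [rng.normal(size=(2, 3)), rng.normal(size=(3, 1))]
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])
for _ in range(2000):
    weights = backprop_epoch(weights, X, Y)

In the paper's systolic version, the same weight matrices reused forward and (transposed) backward above are streamed through the ring with a single delivery pattern, by either rows or columns.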

Author information

Authors: J. del R. Millán and P. Bofill
Editor information

Eddy Odijk, Martin Rem, Jean-Claude Syre

Copyright information

© 1989 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Millán, J.d.R., Bofill, P. (1989). Learning by back-propagation: Computing in a systolic way. In: Odijk, E., Rem, M., Syre, JC. (eds) PARLE '89 Parallel Architectures and Languages Europe. PARLE 1989. Lecture Notes in Computer Science, vol 366. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-51285-3_44

  • DOI: https://doi.org/10.1007/3-540-51285-3_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-51285-1

  • Online ISBN: 978-3-540-46184-5

  • eBook Packages: Springer Book Archive
