Abstract
In this paper we present a systolic algorithm for back-propagation, a supervised, iterative, gradient-descent, connectionist learning rule. The algorithm works on feedforward networks where connections can skip layers, and it fully exploits the spatial and training parallelism inherent to back-propagation. Spatial parallelism arises during the propagation of activity (forward) and error (backward) for a particular input-output pair; training parallelism is obtained when this computation is carried out simultaneously for all input-output pairs. In the spatial dimension, a single systolic ring carries out sequentially the three main steps of the learning rule: the forward pass, the backward pass and the weight-increment update. Furthermore, the same pattern of matrix delivery is used in both the forward and the backward passes, so the algorithm preserves the similarity between the two passes found in the original model. The resulting systolic algorithm is dual with respect to the pattern of matrix delivery, which may be either by columns or by rows.
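The column/row duality the abstract refers to is already visible in a plain, non-systolic statement of back-propagation: the forward pass multiplies activities by the weight matrix, while the backward pass multiplies errors by its transpose, so a delivery scheme that streams the matrix by columns for one pass streams it by rows for the other. The sketch below, in Python/NumPy with illustrative names, is our own minimal rendering of the three steps named in the abstract for a two-layer network with logistic units; it is not the paper's systolic formulation.

```python
# Minimal back-propagation sketch (illustrative, not the paper's systolic
# algorithm). It shows the three steps the abstract names -- forward pass,
# backward pass, weight-increment update -- and the column/row duality:
# the forward pass multiplies by W, the backward pass by W transposed.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(W1, W2, x, target, lr=0.5):
    # Forward pass: activity propagates through the weight matrices.
    h = sigmoid(W1 @ x)          # hidden activations
    y = sigmoid(W2 @ h)          # output activations

    # Backward pass: error propagates through the *transposed* matrices,
    # i.e. the same matrices delivered by rows instead of columns.
    delta_out = (y - target) * y * (1.0 - y)
    delta_hid = (W2.T @ delta_out) * h * (1.0 - h)

    # Weight-increment update: plain gradient descent for one pattern.
    W2 -= lr * np.outer(delta_out, h)
    W1 -= lr * np.outer(delta_hid, x)
    return W1, W2
```

Training parallelism, in this sketch, would correspond to running `train_step` for all input-output pairs at once and accumulating the weight increments before applying them.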
Copyright information
© 1989 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Millán, J.d.R., Bofill, P. (1989). Learning by back-propagation: Computing in a systolic way. In: Odijk, E., Rem, M., Syre, J.C. (eds) PARLE '89 Parallel Architectures and Languages Europe. PARLE 1989. Lecture Notes in Computer Science, vol 366. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-51285-3_44
DOI: https://doi.org/10.1007/3-540-51285-3_44
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-51285-1
Online ISBN: 978-3-540-46184-5
eBook Packages: Springer Book Archive