Implementing Hebbian learning in a rank-based neural network

  • Part I: Coding and Learning in Biology
  • Conference paper

Artificial Neural Networks — ICANN'97 (ICANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1327)

Abstract

Recent work has shown that biologically motivated networks of spiking neurons can potentially process information very quickly by encoding information in the latencies at which different neurons fire, rather than in their firing rates. In this paper, the relevant information is the rank order of the latencies of competing neurons. We propose a Hebbian reinforcement learning scheme that adjusts the weights of a terminal layer of decision neurons in order to process this information, and we show that this learning rule is efficient on a simple pattern recognition task. In conclusion, we discuss further extensions of this learning strategy for artificial vision.
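The abstract describes decision neurons trained with a Hebbian rule on the rank order in which their inputs fire. The paper's exact update rule is not reproduced on this page, so the following Python sketch only illustrates the general idea under stated assumptions: a modulation factor `mod` that attenuates later-firing inputs (in the spirit of Thorpe's rank-order coding), a simple move-toward-the-rank-code Hebbian update, and a toy two-class task. All function names, parameters, and the training task are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def firing_ranks(intensities):
    """Rank-order code: the most strongly driven input fires first (rank 0)."""
    order = np.argsort(-intensities)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(order))
    return ranks

def decision_activations(ranks, weights, mod=0.8):
    """Each decision neuron sums its weights, attenuated by input firing rank."""
    return weights @ (mod ** ranks)

def hebbian_update(weights, ranks, target, lr=0.1, mod=0.8):
    """Reinforce the target neuron's weights in proportion to how early each
    input fired -- an assumed rank-based Hebbian rule, for illustration only."""
    weights[target] += lr * (mod ** ranks - weights[target])
    return weights

# Toy task: two prototype input patterns, two decision neurons.
patterns = {0: np.array([0.9, 0.6, 0.3, 0.1]),
            1: np.array([0.1, 0.3, 0.6, 0.9])}
weights = rng.uniform(0.0, 0.1, size=(2, 4))

for _ in range(50):
    label = int(rng.integers(2))
    x = patterns[label] + 0.05 * rng.standard_normal(4)
    weights = hebbian_update(weights, firing_ranks(x), label)

# After training, each pattern should drive its own decision neuron hardest.
for label, proto in patterns.items():
    act = decision_activations(firing_ranks(proto), weights)
    print(label, int(np.argmax(act)))
```

Because the weight vector of each decision neuron converges toward the rank code of its class, the dot product with a matching rank code is maximal for the correct neuron, which is what makes a winner-take-all readout of the latency order possible.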



Editor information

Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud


Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Samuelides, M., Thorpe, S., Veneau, E. (1997). Implementing Hebbian learning in a rank-based neural network. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, JD. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020147


  • DOI: https://doi.org/10.1007/BFb0020147

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9

  • eBook Packages: Springer Book Archive
