
Increasing the capacity of a Hopfield network without sacrificing functionality

Conference paper in Artificial Neural Networks — ICANN'97 (ICANN 1997), Part III: Learning: Theory and Algorithms.

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1327)

Abstract

Hopfield networks are commonly trained by one of two algorithms. The simplest of these is the Hebb rule, which has a low absolute capacity of n/(2 ln n), where n is the total number of neurons. This capacity can be increased to n by using the pseudo-inverse rule. However, capacity is not the only consideration. It is important for rules to be local (the weight of a synapse depends only on information available to the two neurons it connects), incremental (learning a new pattern can be done knowing only the old weight matrix and not the actual patterns stored) and immediate (the learning process is not a limit process). The Hebbian rule is all of these, but the pseudo-inverse rule is never incremental, and is local only if it is not immediate. The question addressed by this paper is, ‘Can the capacity of the Hebbian rule be increased without losing locality, incrementality or immediacy?’
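The locality, incrementality and immediacy of the Hebb rule can be made concrete in a short sketch. The following is not taken from the paper; it is a standard textbook formulation of Hebbian storage and synchronous recall for ±1 patterns, using numpy:

```python
import numpy as np

def hebb_train(patterns):
    """Hebb-rule weight matrix for a Hopfield network.

    patterns: array of shape (p, n) with entries in {-1, +1}.
    Local: entry W[i, j] uses only the activities xi[i] and xi[j].
    Incremental: each pattern adds one outer-product term to W.
    Immediate: a single pass over the patterns, no limit process.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for xi in patterns:            # one pattern at a time (incremental)
        W += np.outer(xi, xi) / n  # (i, j) term depends only on xi[i], xi[j]
    np.fill_diagonal(W, 0.0)       # no self-connections
    return W

def recall(W, state, steps=10):
    """Synchronous dynamics: repeatedly apply sign(W @ state)."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state
```

Well below the n/(2 ln n) capacity, each stored pattern is a fixed point of the recall dynamics, which is what the capacity results in the paper quantify.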

Here a new algorithm is proposed. This algorithm is local, immediate and incremental. In addition, it has an absolute capacity significantly higher than that of the Hebbian method: n/√(2 ln n).
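The rule itself is defined in the body of the paper. As an illustration only, the update now commonly attributed to this paper (Storkey's rule) takes the local, incremental form ΔW_ij = (ξ_i ξ_j − ξ_i h_ji − h_ij ξ_j)/n, with local field h_ij = Σ_{k≠i,j} W_ik ξ_k. A numpy sketch under that assumption — the formula should be checked against the paper's own definition:

```python
import numpy as np

def storkey_update(W, xi):
    """One incremental update with a new pattern xi (entries in {-1, +1}).

    Sketch of W_ij += (xi_i*xi_j - xi_i*h_ji - h_ij*xi_j) / n, where
    h_ij = sum over k != i, j of W_ik * xi_k.  This is the form commonly
    attributed to the paper, assumed here for illustration.
    """
    n = W.shape[0]
    s = W @ xi                                                # full sums over k
    # h[i, j]: remove the k == j and k == i terms from s[i]
    h = s[:, None] - W * xi[None, :] - (np.diag(W) * xi)[:, None]
    dW = (np.outer(xi, xi) - xi[:, None] * h.T - h * xi[None, :]) / n
    W = W + dW
    np.fill_diagonal(W, 0.0)
    return W
```

The update is symmetric (ΔW_ij = ΔW_ji), and with W = 0 its first step reduces to the Hebbian outer product, so it only departs from the Hebb rule once earlier patterns have shaped the local fields.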

In this paper the new learning rule is introduced, and a heuristic calculation of the absolute capacity of the learning algorithm is given. Simulations show that this calculation does indeed provide a good measure of the capacity for finite network sizes. Comparisons are made between the Hebb rule and this new learning rule.
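A minimal version of such a capacity simulation — not the paper's protocol, just a hypothetical harness for the Hebb-rule baseline — stores p random patterns and measures what fraction of them are fixed points of the dynamics:

```python
import numpy as np

def hebb_matrix(patterns):
    """Hebb-rule weights: sum of outer products over n, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def fraction_stable(W, patterns):
    """Fraction of patterns that survive one step of sign(W @ xi) unchanged."""
    hits = sum(np.array_equal(np.where(W @ xi >= 0, 1, -1), xi)
               for xi in patterns)
    return hits / len(patterns)

def run_trial(n, p, rng):
    """Store p random +/-1 patterns in an n-neuron network; report stability."""
    pats = np.where(rng.random((p, n)) < 0.5, -1, 1)
    return fraction_stable(hebb_matrix(pats), pats)
```

Sweeping p for several network sizes n and locating where the stable fraction drops below one gives an empirical capacity curve, which can then be compared against the n/(2 ln n) and n/√(2 ln n) predictions.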



Editors: Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud


Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Storkey, A. (1997). Increasing the capacity of a Hopfield network without sacrificing functionality. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, JD. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020196


  • DOI: https://doi.org/10.1007/BFb0020196


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9
