Original contribution
Convergence results in an associative memory model

https://doi.org/10.1016/0893-6080(88)90029-9

Abstract

This paper presents rigorous mathematical proofs for some observed convergence phenomena in an associative memory model introduced by Hopfield (based on Hebbian rules) for storing a number of random n-bit patterns. The capability of the model to correct a linear number of random errors in a bit pattern has been established earlier, but the existence of a large domain of attraction (correcting a linear number of arbitrary errors) has not been proved.

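The model referred to above is Hopfield's network with Hebbian (outer-product) storage of random ±1 patterns and threshold (sign) updates. As a rough illustration only, the following is a minimal Python sketch of that setup under common conventions; the helper names (`hebbian_weights`, `recall`), the asynchronous update schedule, and the parameters n = 500, m = 10, ρ = 0.024 are assumptions chosen for the demo, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Outer-product (Hebbian) weight matrix for +/-1 patterns; zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_sweeps=50):
    """Asynchronous sign updates until a fixed point (or the sweep limit) is reached."""
    state = state.copy()
    n = state.size
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):
            new = 1 if W[i] @ state >= 0 else -1
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:          # fixed point: a stable state of the network
            break
    return state

# Store m random n-bit patterns and test recovery from rho*n random bit flips.
n, m, rho = 500, 10, 0.024
patterns = rng.choice([-1, 1], size=(m, n))
W = hebbian_weights(patterns)

probe = patterns[0].copy()
flips = rng.choice(n, size=int(rho * n), replace=False)
probe[flips] *= -1               # corrupt rho*n randomly chosen bits

recovered = recall(W, probe)
print("Hamming distance to stored pattern:", np.sum(recovered != patterns[0]))
```

With these illustrative parameters, m is below n/(4 log n), the regime covered by the first result below, so the corrupted probe is expected to return to the stored pattern.
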
We present proofs for the following:

  • When m, the number of patterns stored, is less than n/(4 log n), the fundamental memories have a domain of attraction of radius ρn with ρ = 0.024, and the algorithm converges in time O(log log n).

  • When m = αn (with α small), all patterns within a distance ρn from a fundamental memory end up, in constant time, within a distance ϵn from the fundamental memory, where ϵ is about e^(−1/(4α)).

We also extend somewhat Newman's description of the “energy landscape,” and prove the existence of an exponential number of stable states (extraneous memories) with convergence properties similar to those of the fundamental memories.
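For reference, the "energy landscape" in the standard formulation of Hopfield's model is the quadratic function below, written with Hebbian (outer-product) weights; the 1/n normalization is a common convention and may differ from the paper's.

```latex
E(s) \;=\; -\tfrac{1}{2} \sum_{i \ne j} w_{ij} \, s_i s_j,
\qquad
w_{ij} \;=\; \frac{1}{n} \sum_{\mu=1}^{m} \xi_i^{\mu} \xi_j^{\mu},
\qquad s_i \in \{-1, +1\}.
```

Under asynchronous sign updates with zero self-coupling, E never increases, so the dynamics settle into local minima of this landscape; the fundamental memories and the exponentially many extraneous memories mentioned above are such minima.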
