The Kernel Hopfield Memory Network

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3305)

Abstract

Kernel theory, drawn from work on learning machines, is applied to the Hopfield neural network. This provides new insight into the operation of the network as an associative memory. The kernel "trick" defines an embedding of memory patterns into (higher- or infinite-dimensional) memory feature vectors, and the training of the network is carried out in this feature space. Generalizing the network via kernel theory improves its performance in three respects. First, an adequate kernel selection guarantees that any set of memory patterns are attractors of the network dynamics. Second, the basins of attraction of the memory patterns are enlarged, improving the recall capacity. Third, since the memory patterns are mapped into a higher-dimensional feature space, the memory capacity density is effectively increased. These aspects are demonstrated experimentally on sets of random memory patterns.
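The idea sketched in the abstract can be illustrated with a minimal recall loop. This is not the paper's algorithm: it assumes a simple "Hebbian in feature space" rule in which the local field is a kernel expansion over the stored patterns, with inner products between states replaced by a polynomial kernel. The names `poly_kernel` and `recall` are illustrative.

```python
import numpy as np

def poly_kernel(u, v, p=3):
    """Polynomial kernel on the normalized overlap; odd p preserves sign symmetry."""
    return (u @ v / len(u)) ** p

def recall(patterns, probe, p=3, max_iter=20):
    """Iterate s_i <- sign(sum_mu xi^mu_i * K(xi^mu, s)) to a fixed point."""
    s = probe.copy()
    for _ in range(max_iter):
        # Local field: kernel-weighted superposition of stored patterns.
        field = sum(xi * poly_kernel(xi, s, p) for xi in patterns)
        s_new = np.sign(field).astype(int)
        s_new[s_new == 0] = 1          # break ties deterministically
        if np.array_equal(s_new, s):   # reached a fixed point
            break
        s = s_new
    return s

rng = np.random.default_rng(0)
N, P = 200, 5
patterns = rng.choice([-1, 1], size=(P, N))

probe = patterns[0].copy()
probe[:20] *= -1                       # corrupt 10% of the bits
restored = recall(patterns, probe)
print(np.array_equal(restored, patterns[0]))
```

Because the kernel raises the overlap to an odd power, the contribution of the nearest stored pattern dominates the local field while the near-orthogonal cross-talk of other random patterns is suppressed, which is one intuitive reading of the enlarged basins of attraction claimed in the abstract.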



Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

García, C., Moreno, J.A. (2004). The Kernel Hopfield Memory Network. In: Sloot, P.M.A., Chopard, B., Hoekstra, A.G. (eds) Cellular Automata. ACRI 2004. Lecture Notes in Computer Science, vol 3305. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30479-1_78

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-23596-5

  • Online ISBN: 978-3-540-30479-1

  • eBook Packages: Springer Book Archive
