
Application of Hopfield-like Neural Networks to Nonlinear Factorization

  • Conference paper
Compstat

Abstract

The problem of binary factorization of complex patterns in a recurrent Hopfield-like neural network was studied by means of computer simulation. The network's ability to perform factorization was analyzed as a function of the number and sparseness of the factors mixed into the presented patterns. Binary factorization in a sparsely encoded Hopfield-like neural network is treated both as an efficient statistical method and as a functional model of the hippocampal CA3 field.

*This work was supported by the Grant Agency of the Czech Republic, grants No. 201/01/1192 and 201/00/1031.
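The setting the abstract sketches can be illustrated with a small simulation: sparse binary factors are superimposed (by logical OR) into mixed patterns, the mixtures are stored with a Hebbian rule, and the recall dynamics are then expected to settle on individual factors. The sketch below is only illustrative; the network size, sparseness, the covariance form of the Hebbian rule, and the activity-fixing update dynamics are assumptions made here, not the parameters or exact rule used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- assumed problem sizes (illustrative, not taken from the paper) ---
N = 500        # number of neurons
L = 20         # number of binary factors
p = 0.05       # sparseness: fraction of active neurons per factor
C = 3          # factors mixed (by logical OR) into each presented pattern
M = 200        # number of mixed patterns used for learning

# sparse binary factors
factors = (rng.random((L, N)) < p).astype(float)

# each presented pattern is a Boolean superposition of C random factors
patterns = np.zeros((M, N))
for m in range(M):
    chosen = rng.choice(L, size=C, replace=False)
    patterns[m] = np.clip(factors[chosen].sum(axis=0), 0, 1)

# covariance-type Hebbian rule, a common choice for sparsely encoded
# Hopfield-type models (assumed here)
q = patterns.mean()                      # mean activity of learned patterns
W = (patterns - q).T @ (patterns - q) / N
np.fill_diagonal(W, 0.0)

def recall(cue, k_active, steps=30):
    """Iterate the network from a cue, keeping the k most excited
    neurons active at each step (a simple way to fix network activity)."""
    x = cue.copy()
    for _ in range(steps):
        h = W @ (x - q)                  # postsynaptic potentials
        new = np.zeros(N)
        new[np.argsort(h)[-k_active:]] = 1.0
        if np.array_equal(new, x):       # fixed point reached
            break
        x = new
    return x

# cue the network with a corrupted version of one factor and check whether
# the stable state matches that factor, i.e. whether the mixtures were
# effectively "factorized" into separate attractors
f = factors[0]
cue = f * (rng.random(N) > 0.3)          # drop 30% of the factor's active bits
out = recall(cue, k_active=int(f.sum()))
overlap = (out @ f) / f.sum()
print(f"overlap with original factor: {overlap:.2f}")
```

In such a simulation the quality of factorization can be measured by the overlap between the stable state and the original factor, and one can vary the number of mixed factors C and the sparseness p, which is the kind of dependence the abstract says was analyzed.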





Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Husek, D., Frolov, A.A., Rezankova, H., Snasel, V. (2002). Application of Hopfield-like Neural Networks to Nonlinear Factorization. In: Härdle, W., Rönz, B. (eds) Compstat. Physica, Heidelberg. https://doi.org/10.1007/978-3-642-57489-4_22


  • DOI: https://doi.org/10.1007/978-3-642-57489-4_22

  • Publisher Name: Physica, Heidelberg

  • Print ISBN: 978-3-7908-1517-7

  • Online ISBN: 978-3-642-57489-4

  • eBook Packages: Springer Book Archive
