Loading Temporal Associative Memory Using the Neuronic Equation

  • Conference paper
Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003 (ICANN 2003, ICONIP 2003)

Abstract

We discuss the loading capacity of the neuronic equation for temporal associative memory. We show explicitly how to synthesize a perfect temporal associative memory using a network of such neurons, whose non-linear dynamics can be fully linearized in tensorial space.

Supported by the National Science Council under grant NSC 91-2213-E-002-124.
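The paper's construction rests on the neuronic equation and its tensorial linearization, which are developed in the paper itself and are not reproduced here. As a rough, hedged illustration of the behaviour a temporal associative memory must exhibit, the sketch below implements only the classic asymmetric cross-correlation (Hebbian) rule for bipolar pattern sequences, iterating x(t+1) = sgn(W x(t)). The function names, network size, and noise level are illustrative assumptions, not the authors' construction.

import numpy as np

def store_sequence(patterns):
    # Cross-correlation (asymmetric Hebbian) weights mapping each bipolar
    # pattern x^k onto its successor x^{k+1} in the stored cycle:
    #   W = (1/N) * sum_k x^{k+1} (x^k)^T
    X = np.asarray(patterns, dtype=float)    # shape (P, N), entries +/-1
    X_next = np.roll(X, -1, axis=0)          # successor of each pattern (cyclic)
    return X_next.T @ X / X.shape[1]

def recall(W, x, steps):
    # Synchronous update x(t+1) = sgn(W x(t)) replays the stored sequence.
    trajectory = [x.copy()]
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1                        # break ties deterministically
        trajectory.append(x.copy())
    return trajectory

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, P = 64, 4                             # 64 neurons, a cycle of 4 patterns
    patterns = np.sign(rng.standard_normal((P, N)))
    W = store_sequence(patterns)
    # Probe with a corrupted copy of the first pattern (about 5% of bits flipped).
    probe = patterns[0] * np.where(rng.random(N) < 0.05, -1.0, 1.0)
    for t, x in enumerate(recall(W, probe, steps=P)[1:], start=1):
        overlap = float(x @ patterns[t % P]) / N
        print(f"step {t}: overlap with stored pattern {t % P} = {overlap:+.2f}")

Starting from a slightly corrupted first pattern, each synchronous update should land on the next pattern in the stored cycle (overlap close to +1.00); a perfect temporal associative memory, in the sense of the abstract, is one that reproduces the stored sequence without error.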

Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Liou, C.-Y., Sou, U.-C. (2003). Loading Temporal Associative Memory Using the Neuronic Equation. In: Kaynak, O., Alpaydin, E., Oja, E., Xu, L. (eds) Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003. Lecture Notes in Computer Science, vol 2714. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44989-2_7

  • DOI: https://doi.org/10.1007/3-540-44989-2_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40408-8

  • Online ISBN: 978-3-540-44989-8

  • eBook Packages: Springer Book Archive
