
Comparative Analysis of Learning Methods of Cellular-Neural Associative Memory

Conference paper. In: Parallel Computing Technologies (PaCT 1999). Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1662).

Abstract

In this paper, various methods of CNAM learning (synthesis) are compared in order to identify their common features. This makes it possible to transfer important characteristics among the methods and to make some assumptions about their capabilities. The influence of the learning parameters of some methods on CNAM stability is also investigated, and recommendations on their choice are given.
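
The abstract itself contains no code, but a rough, hedged sketch may help fix ideas about the kind of system under discussion: a cellular-neural associative memory is essentially a Hopfield-style network whose weights are confined to a local neighbourhood of each cell. The Python sketch below assumes a binary, discrete-time model on a 2-D grid with a radius-1 neighbourhood and a simple Hebbian (outer-product) rule; the helper names (neighbourhood_mask, hebbian_local_weights, recall) are hypothetical, and the Hebbian rule merely stands in for the learning/synthesis methods the paper actually compares.

```python
import numpy as np

# Illustrative sketch only: a discrete-time cellular-neural associative memory
# (CNAM) with weights restricted to a local neighbourhood, trained here with a
# simple Hebbian rule. The radius, dynamics, and learning rule are assumptions
# made for illustration, not the specific methods analysed in the paper.

def neighbourhood_mask(rows, cols, radius=1):
    """Boolean mask M[i, j] = True iff grid cells i and j lie within `radius`
    of each other (including i == j)."""
    n = rows * cols
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        ri, ci = divmod(i, cols)
        for j in range(n):
            rj, cj = divmod(j, cols)
            if abs(ri - rj) <= radius and abs(ci - cj) <= radius:
                mask[i, j] = True
    return mask

def hebbian_local_weights(patterns, mask):
    """Hebbian (outer-product) weights, zeroed outside the local neighbourhood
    and on the diagonal; `patterns` holds +/-1 vectors, shape (p, n)."""
    p, n = patterns.shape
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w * mask

def recall(w, x, steps=20):
    """Synchronous threshold dynamics x <- sign(W x); returns the final state."""
    x = x.copy()
    for _ in range(steps):
        x_new = np.where(w @ x >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rows = cols = 6
    patterns = rng.choice([-1, 1], size=(3, rows * cols))
    mask = neighbourhood_mask(rows, cols, radius=1)
    w = hebbian_local_weights(patterns, mask)
    # Corrupt roughly 20% of the first stored pattern, then try to recall it.
    noisy = patterns[0] * rng.choice([1, 1, 1, 1, -1], size=rows * cols)
    restored = recall(w, noisy)
    print("overlap with stored pattern:", int(restored @ patterns[0]))
```

The neighbourhood mask is the only difference from a fully connected Hopfield memory in this sketch; it models the local connectivity that characterizes cellular-neural networks.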

Copyright information

© 1999 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pudov, S. (1999). Comparative Analysis of Learning Methods of Cellular-Neural Associative Memory. In: Malyshkin, V. (eds) Parallel Computing Technologies. PaCT 1999. Lecture Notes in Computer Science, vol 1662. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48387-X_12

  • DOI: https://doi.org/10.1007/3-540-48387-X_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-66363-8

  • Online ISBN: 978-3-540-48387-8
