Gradient Computation of Continuous-Time Cellular Neural/Nonlinear Networks with Linear Templates via the CNN Universal Machine

Abstract

Single-layer, continuous-time cellular neural/nonlinear networks (CNN) with linear templates are considered. The networks are programmed through their template parameters. A fundamental question in template training or adaptation is how to compute or approximate the gradient of the error as a function of the template parameters. Exact equations are developed for computing these gradients. The equations are similar to the CNN network equations, i.e., they have the same neighborhood and connectivity as the original CNN. It is shown that a CNN with a modified output function can compute the gradients. Fast on-line gradient computation is therefore possible via the CNN Universal Machine, which allows on-line adaptation and training. The method for computing the gradient on-chip is investigated and demonstrated.

Cite this article

Brendel, M., Roska, T. & Bártfai, G. Gradient Computation of Continuous-Time Cellular Neural/Nonlinear Networks with Linear Templates via the CNN Universal Machine. Neural Processing Letters 16, 111–120 (2002). https://doi.org/10.1023/A:1019933009505
