
High-Order Hopfield Neural Networks

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3496)

Abstract

In 1984, Hopfield showed that the time evolution of a symmetric Hopfield neural network is a motion in state space that seeks out minima of an energy function (i.e., the equilibrium point set of the network). Because high-order Hopfield neural networks have more extensive applications than first-order Hopfield neural networks, this paper discusses the convergence of high-order Hopfield neural networks. The obtained results ensure that high-order Hopfield neural networks ultimately converge to the equilibrium point set. Our result removes the requirement that the connection weight matrix be symmetric, and it includes the classic result on Hopfield neural networks, which are a special case of high-order Hopfield neural networks. Finally, an example is given to verify the effectiveness of our results.
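The energy-descent behavior the abstract refers to can be illustrated with a minimal sketch. This is the classic first-order discrete Hopfield model (not the paper's high-order continuous model, and the variable names here are illustrative): with symmetric weights and zero self-connections, asynchronous sign updates never increase the energy E(s) = -(1/2) sᵀWs, so the state settles into an equilibrium.

```python
import numpy as np

def energy(W, s):
    """Hopfield energy E(s) = -(1/2) s^T W s."""
    return -0.5 * s @ W @ s

def run_hopfield(W, s, steps=200, seed=0):
    """Asynchronous sign updates: pick one unit at a time, set it to
    the sign of its local field. With W symmetric and zero-diagonal,
    each update can only lower (or keep) the energy."""
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Symmetric weight matrix with zero diagonal, as Hopfield's 1984 result assumes.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))
W = (A + A.T) / 2
np.fill_diagonal(W, 0)

s0 = rng.choice([-1.0, 1.0], size=8)
s_final = run_hopfield(W, s0)
assert energy(W, s_final) <= energy(W, s0)  # energy did not increase
```

The paper's contribution is precisely that, for the high-order generalization, convergence to the equilibrium point set can be guaranteed without the symmetry assumption on W used above.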



References

  1. Hopfield, J.J.: Neurons with Graded Response Have Collective Computational Properties Like Those of Two-State Neurons. Proc. Natl. Acad. Sci. 81, 3088–3092 (1984)

  2. Dembo, A., Farotimi, O., Kailath, T.: High-Order Absolutely Stable Neural Networks. IEEE Trans. Circ. Syst. II 38, 57–65 (1991)

  3. Kosmatopoulos, E.B., Polycarpou, M.M., Christodoulou, M.A., et al.: High-Order Neural Network Structures for Identification of Dynamical Systems. IEEE Trans. Neural Networks 6, 422–431 (1995)

  4. Zhang, T., Ge, S.S., Hang, C.C.: Neural-Based Direct Adaptive Control for a Class of General Nonlinear Systems. International Journal of Systems Science 28, 1011–1020 (1997)

  5. Su, J., Hu, A., He, Z.: Solving a Kind of Nonlinear Programming Problems via Analog Neural Networks. Neurocomputing 18, 1–9 (1998)

  6. Stringer, S.M., Rolls, E.T., Trappenberg, T.P.: Self-Organising Continuous Attractor Networks with Multiple Activity Packets, and the Representation of Space. Neural Networks 17, 5–27 (2004)

  7. Xu, B.J., Liu, X.Z., Liao, X.X.: Global Asymptotic Stability of High-Order Hopfield Type Neural Networks with Time Delays. Computers and Mathematics with Applications 45, 1729–1737 (2003)

  8. Sun, C., Zhang, K., Fei, S., Feng, C.B.: On Exponential Stability of Delayed Neural Networks with a General Class of Activation Functions. Physics Letters A 298, 122–132 (2002)

  9. Sun, C., Feng, C.B.: Exponential Periodicity of Continuous-Time and Discrete-Time Neural Networks with Delays. Neural Processing Letters 19, 131–146 (2004)

  10. Sun, C., Feng, C.B.: On Robust Exponential Periodicity of Interval Neural Networks with Delays. Neural Processing Letters 20, 53–61 (2004)

  11. Cao, J.: On Exponential Stability and Periodic Solution of CNNs with Delay. Physics Letters A 267, 312–318 (2000)

  12. Liao, X., Chen, G., Sanchez, E.: Delay-Dependent Exponential Stability Analysis of Delayed Neural Networks: An LMI Approach. Neural Networks 15, 855–866 (2002)

  13. Cao, J., Wang, J.: Global Asymptotic Stability of Recurrent Neural Networks with Lipschitz-Continuous Activation Functions and Time-Varying Delays. IEEE Trans. Circuits Syst. I 50, 34–44 (2003)


Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Shen, Y., Zong, X., Jiang, M. (2005). High-Order Hopfield Neural Networks. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_36


  • DOI: https://doi.org/10.1007/11427391_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-25912-1

  • Online ISBN: 978-3-540-32065-4

  • eBook Packages: Computer Science (R0)
