Abstract
In 1984 Hopfield showed that the time evolution of a symmetric Hopfield neural network is a motion in state space that seeks out minima of an energy function (i.e., the equilibrium point set of the network). Because high-order Hopfield neural networks have more extensive applications than standard Hopfield neural networks, this paper discusses the convergence of high-order Hopfield neural networks. The obtained results ensure that high-order Hopfield neural networks ultimately converge to the equilibrium point set. Our result removes the requirement that the connection weight matrix be symmetric, and it includes the classic result on Hopfield neural networks, which are a special case of high-order Hopfield neural networks. Finally, an example is given to verify the effectiveness of our results.
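The energy-descent behaviour the abstract refers to can be illustrated with a minimal sketch. This is not the paper's high-order model; it is the classic discrete Hopfield network with a symmetric, zero-diagonal weight matrix (the very symmetry assumption the paper removes), chosen here only because its energy argument is the standard one: each asynchronous sign update can only lower the energy, so the trajectory converges to an equilibrium point. All names and sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative only: a classic discrete Hopfield network, not the paper's
# high-order model. Symmetric weights with zero diagonal guarantee that
# the energy E(s) = -0.5 * s^T W s is non-increasing under asynchronous
# sign updates, so the state converges to an equilibrium (local minimum).
n = 20
W = rng.standard_normal((n, n))
W = (W + W.T) / 2          # symmetrize (classic Hopfield assumption)
np.fill_diagonal(W, 0.0)   # no self-connections

def energy(s):
    return -0.5 * s @ W @ s

s = rng.choice([-1.0, 1.0], size=n)   # random bipolar initial state
energies = [energy(s)]
changed = True
for sweep in range(100):
    changed = False
    for i in range(n):
        new = 1.0 if W[i] @ s >= 0 else -1.0   # asynchronous sign update
        if new != s[i]:
            s[i] = new
            changed = True
            energies.append(energy(s))
    if not changed:        # no neuron flipped: an equilibrium point
        break

# Each flip satisfies dE = -2 * new * (W[i] @ s) <= 0, so E is monotone.
assert all(b <= a + 1e-12 for a, b in zip(energies, energies[1:]))
print("equilibrium reached; final energy:", energies[-1])
```

The paper's contribution, by contrast, is that for high-order Hopfield networks convergence to the equilibrium point set holds without assuming `W` is symmetric.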
References
Hopfield, J.J.: Neurons with Graded Response Have Collective Computational Properties Like those of Two-state Neurons. Proc. Natl. Acad. Sci. 81, 3088–3092 (1984)
Dembo, A., Farotimi, O., Kailath, T.: High-Order Absolutely Stable Neural Networks. IEEE Trans. Circ. Syst. II 38, 57–65 (1991)
Kosmatopoulos, E.B., Polycarpou, M.M., Christodoulou, M.A., et al.: High-Order Neural Networks Structures for Identification of Dynamical Systems. IEEE Trans. Neural Networks 6, 422–431 (1995)
Zhang, T., Ge, S.S., Hang, C.C.: Neural-Based Direct Adaptive Control for a Class of General Nonlinear Systems. International Journal of Systems Science 28, 1011–1020 (1997)
Su, J., Hu, A., He, Z.: Solving a Kind of Nonlinear Programming Problems via Analog Neural Networks. Neurocomputing 18, 1–9 (1998)
Stringer, S.M., Rolls, E.T., Trappenberg, T.P.: Self-organising Continuous Attractor Networks with Multiple Activity Packets, and the Representation of Space. Neural Networks 17, 5–27 (2004)
Xu, B.J., Liu, X.Z., Liao, X.X.: Global Asymptotic Stability of High-Order Hopfield Type Neural Networks with Time Delays. Computers and Mathematics with Applications 45, 1729–1737 (2003)
Sun, C., Zhang, K., Fei, S., Feng, C.B.: On Exponential Stability of Delayed Neural Networks with a General Class of Activation Functions. Physics Letters A 298, 122–132 (2002)
Sun, C., Feng, C.B.: Exponential Periodicity of Continuous-Time and Discrete-Time Neural Networks with Delays. Neural Processing Letters 19, 131–146 (2004)
Sun, C., Feng, C.B.: On Robust Exponential Periodicity of Interval Neural Networks with Delays. Neural Processing Letters 20, 53–61 (2004)
Cao, J.: On Exponential Stability and Periodic Solution of CNN’s with Delay. Physics Letters A 267, 312–318 (2000)
Liao, X., Chen, G., Sanchez, E.: Delay-dependent Exponential Stability Analysis of Delayed Neural Networks: an LMI approach. Neural Networks 15, 855–866 (2002)
Cao, J., Wang, J.: Global Asymptotic Stability of Recurrent Neural Networks with Lipschitz-continuous Activation Functions and Time-Varying Delays. IEEE Trans. Circuits Syst. I 50, 34–44 (2003)
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Shen, Y., Zong, X., Jiang, M. (2005). High-Order Hopfield Neural Networks. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_36
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-25912-1
Online ISBN: 978-3-540-32065-4