Abstract
In 1984, Hopfield showed that the time evolution of a symmetric Hopfield neural network is a motion in state space that seeks out minima of the energy function (i.e., the equilibrium point set of the network). High-order Hopfield neural networks have broader applications than standard Hopfield networks, and the convergence of such networks has been studied. In practice, a neural network is often subject to environmental noise. It is therefore useful and interesting to determine whether a high-order neural network still approaches some limit set under stochastic perturbation. In this paper, we give a number of useful bounds on the noise intensity under which a stochastic high-order neural network approaches its limit set. Our result removes the requirement that the connection weight matrix be symmetric, and it includes the classic result on Hopfield neural networks, which are a special case of stochastic high-order Hopfield neural networks. Finally, an example is given to verify the effectiveness of our results.
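The setting the abstract describes can be illustrated with a toy simulation. The sketch below is not the paper's high-order construction or its bounds: it uses a first-order Hopfield-type stochastic differential equation, and the weight matrix `W`, noise intensity `sigma`, step size, and horizon are all illustrative assumptions. It compares the noiseless flow, which settles at an equilibrium of the energy-minimizing dynamics, with a noisy trajectory, which remains near the limit set when the noise intensity is small.

```python
import numpy as np

# A minimal sketch, assuming a first-order Hopfield-type SDE
#     dx = (-x + W tanh(x)) dt + sigma dB_t
# simulated with the Euler-Maruyama method. W, sigma, dt, and the
# horizon below are illustrative choices, not the paper's model.

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n))
W = 0.2 * (A + A.T) / 2           # symmetric, small gain: contractive near 0

def drift(x):
    return -x + W @ np.tanh(x)

dt, steps, sigma = 0.01, 5000, 0.05
x_det = rng.normal(size=n)        # noiseless trajectory
x_sto = x_det.copy()              # noisy trajectory, same initial state
traj_max = np.linalg.norm(x_sto)

for _ in range(steps):
    x_det = x_det + drift(x_det) * dt
    x_sto = (x_sto + drift(x_sto) * dt
             + sigma * np.sqrt(dt) * rng.normal(size=n))
    traj_max = max(traj_max, np.linalg.norm(x_sto))

print(np.linalg.norm(x_det))      # near 0: the noiseless flow reaches equilibrium
print(traj_max)                   # bounded: small noise does not destroy convergence
```

With `W` symmetric, the noiseless flow decreases the classic Hopfield energy; the paper's point is that convergence to the limit set survives a sufficiently small stochastic perturbation, and that the symmetry assumption on the weights can be dropped.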
References
Hopfield, J.J.: Neurons with Graded Response Have Collective Computational Properties Like those of Two-state Neurons. Proc. Natl. Acad. Sci. USA 81, 3088–3092 (1984)
Dembo, A., Farotimi, O., Kailath, T.: High-Order Absolutely Stable Neural Networks. IEEE Trans. Circ. Syst. II 38, 57–65 (1991)
Kosmatopoulos, E.B., Polycarpou, M.M., Christodoulou, M.A., et al.: High-Order Neural Networks Structures for Identification of Dynamical Systems. IEEE Trans. Neural Networks 6, 422–431 (1995)
Zhang, T., Ge, S.S., Hang, C.C.: Neural-Based Direct Adaptive Control for a Class of General Nonlinear Systems. International Journal of Systems Science 28, 1011–1020 (1997)
Su, J., Hu, A., He, Z.: Solving a Kind of Nonlinear Programming Problems via Analog Neural Networks. Neurocomputing 18, 1–9 (1998)
Stringer, S.M., Rolls, E.T., Trappenberg, T.P.: Self-organising Continuous Attractor Networks with Multiple Activity Packets, and the Representation of Space. Neural Networks 17, 5–27 (2004)
Xu, B.J., Liu, X.Z., Liao, X.X.: Global Asymptotic Stability of High-Order Hopfield Type Neural Networks with Time Delays. Computers and Mathematics with Applications 45, 1729–1737 (2003)
Sun, C., Feng, C.B.: Exponential Periodicity of Continuous-Time and Discrete-Time Neural Networks with Delays. Neural Processing Letters 19, 131–146 (2004)
Sun, C., Feng, C.B.: On Robust Exponential Periodicity of Interval Neural Networks with Delays. Neural Processing Letters 20, 53–61 (2004)
Cao, J.: On Exponential Stability and Periodic Solution of CNN’s with Delay. Physics Letters A 267, 312–318 (2000)
Cao, J., Wang, J.: Global Asymptotic Stability of Recurrent Neural Networks with Lipschitz-continuous Activation Functions and Time-Varying Delays. IEEE Trans. Circuits Syst. I 50, 34–44 (2003)
Shen, Y., Zong, X.J., Jiang, M.H.: High-Order Hopfield Neural Networks. In: Wang, J., Liao, X.-F., Yi, Z. (eds.) ISNN 2005. LNCS, vol. 3496, pp. 235–240. Springer, Heidelberg (2005)
Mao, X.: Exponential Stability of Stochastic Differential Equations. Marcel Dekker, New York (1994)
© 2005 Springer-Verlag Berlin Heidelberg
Shen, Y., Zhao, G., Jiang, M., Hu, S. (2005). Stochastic High-Order Hopfield Neural Networks. In: Wang, L., Chen, K., Ong, Y.S. (eds) Advances in Natural Computation. ICNC 2005. Lecture Notes in Computer Science, vol 3610. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11539087_98
Print ISBN: 978-3-540-28323-2
Online ISBN: 978-3-540-31853-8