
Neural Networks

Volume 13, Issue 10, December 2000, Pages 1135-1143

Contributed article
On stability of nonlinear continuous-time neural networks with delays

https://doi.org/10.1016/S0893-6080(00)00076-9

Abstract

We use the Lyapunov function method to analyze the stability of continuous nonlinear neural networks with delays and obtain some new sufficient conditions ensuring globally asymptotic stability independent of the delays. Three main conditions imposed on the weight matrices are established: (i) the spectral radius satisfies ρ(M−1(|W0|+|Wτ|)K) < 1; (ii) the row norm satisfies ‖M−1(|W0|+|Wτ|)K + P−1((|W0|+|Wτ|)KM−1)TP‖ < 2; (iii) μ2(W0) + ‖Wτ‖2,F < m/k. These three conditions are independent of each other. The delayed Hopfield network, bidirectional associative memory (BAM) network and cellular neural network (CNN) are special cases of the network model considered in this paper, so our results improve upon some previous work by other researchers.
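As a rough numerical illustration (not part of the paper), the quantities appearing in conditions (i)–(iii) can be evaluated with a few lines of NumPy. The matrices below are hypothetical, and several readings are our own assumptions based on the abstract's notation: M, K and P are taken as positive diagonal matrices, the "row norm" is read as the ∞-norm, μ2 is taken as the matrix measure μ2(W0) = λmax((W0 + W0T)/2), the subscript "2,F" on ‖Wτ‖ is read as the spectral 2-norm, and m, k are read as min_i m_i and max_i k_i.

import numpy as np

# Hypothetical example matrices; M, K, P assumed positive diagonal (our assumption).
M = np.diag([1.0, 1.2])
K = np.diag([1.0, 1.0])
P = np.diag([1.0, 0.8])
W0 = np.array([[0.2, -0.3], [0.1, 0.25]])
Wtau = np.array([[0.15, 0.1], [-0.2, 0.1]])
Wabs = np.abs(W0) + np.abs(Wtau)

# (i) spectral radius of M^{-1}(|W0|+|Wtau|)K
rho = max(abs(np.linalg.eigvals(np.linalg.inv(M) @ Wabs @ K)))

# (ii) row norm (read here as the infinity norm) of
#      M^{-1}(|W0|+|Wtau|)K + P^{-1}((|W0|+|Wtau|)K M^{-1})^T P
B = np.linalg.inv(M) @ Wabs @ K + np.linalg.inv(P) @ (Wabs @ K @ np.linalg.inv(M)).T @ P
row_norm = np.linalg.norm(B, np.inf)

# (iii) matrix measure mu_2(W0) = lambda_max((W0+W0^T)/2) plus a norm of Wtau
#       (the abstract's subscript "2,F" is read here as the spectral 2-norm)
mu2 = max(np.linalg.eigvalsh((W0 + W0.T) / 2))
lhs3 = mu2 + np.linalg.norm(Wtau, 2)
m, k = np.diag(M).min(), np.diag(K).max()

print("(i)  ", rho, "< 1 :", rho < 1)
print("(ii) ", row_norm, "< 2 :", row_norm < 2)
print("(iii)", lhs3, "< m/k =", m / k, ":", lhs3 < m / k)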

Introduction

Continuous-time analog neural networks with symmetric connections will always converge to fixed points when the inter-neuronal transmission delays are ignored (Chua & Yang, 1988, Hopfield, 1984), but they may become unstable when time delays are taken into account. It is therefore important to investigate stability conditions for neural networks with delays.

For neural networks without delays, many researchers have derived conditions on the connection weight matrices that ensure stability of the networks (see, for example, Hirsch, 1989, Kelly, 1990, Matsuoka, 1992). These conditions are mostly given in terms of matrix norms or matrix measures. For neural networks with delays, however, it is more difficult to analyze stability because of the introduction of the delays. There are usually two approaches. One is to linearize the system near an equilibrium; since the original system has the same local stability properties as the linearized system near the equilibrium considered, conditions obtained in this way concern local stability around an equilibrium. The other is to construct a suitable Lyapunov function for the system and then derive sufficient conditions ensuring stability; this usually yields global stability. However, constructing a suitable Lyapunov function is usually not an easy task. For a detailed theory of stability analysis for systems with delays, one can consult tutorial books such as Hale, 1977, Qin et al., 1989. Marcus and Westervelt (1989) studied the stability of analog neural networks with delay by linearizing the systems. Civalleri et al., 1993, Gilli, 1994 established sufficient conditions for delay-dependent stability of cellular neural networks (CNNs) with delay using the Lyapunov functional approach. Gopalsamy & He, 1994a, Gopalsamy & He, 1994b obtained sufficient conditions for globally asymptotic stability of delayed bidirectional associative memory (BAM) networks and Hopfield networks, respectively. In this paper we construct different Lyapunov functions to obtain sufficient conditions for delay-independent globally asymptotic stability of a class of continuous-time continuous-state neural networks that includes delayed Hopfield networks, BAM networks and CNNs as special cases. Our conditions relax some of the restrictions on the weight matrices and thus improve some results developed by other authors; in particular, they do not require symmetry of the weight matrices.

Section snippets

Model description

In this paper, we deal with delayed continuous-time neural networks described by the following differential equations with delays:

du_i(t)/dt = −g_i(u_i(t)) + ∑_{j=1}^{n} w_{ij}^0 f_j(u_j(t)) + ∑_{j=1}^{n} w_{ij}^τ f_j(u_j(t−τ_j)) + I_i,   i = 1, 2, …, n,   (1)

where w_{ij}^0, w_{ij}^τ, I_i and τ_j are real constants and the delays τ_j are nonnegative. Suppose further that the following assumptions are satisfied (a simulation sketch of model (1) is given after the assumptions).

  • (A1) g_i: ℝ → ℝ is differentiable and strictly monotone increasing, i.e. m_i = inf_{x∈ℝ} g′_i(x) > 0, i = 1, 2, …, n, where g′_i(x) denotes the derivative of g_i(x).
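For concreteness, the following is a minimal forward-Euler simulation sketch of model (1); it is not taken from the paper. The choices g_i(u) = u (so m_i = 1), f_j = tanh (Lipschitz constant 1), and the particular weights, delays, inputs, initial history and step size are illustrative assumptions only.

import numpy as np

# Forward-Euler simulation of du_i/dt = -g_i(u_i) + sum_j w_ij^0 f_j(u_j(t))
#                                       + sum_j w_ij^tau f_j(u_j(t - tau_j)) + I_i
# All concrete values below are illustrative, not from the paper.
g = lambda u: u                      # g_i(u) = u, so m_i = 1
f = np.tanh                          # activation, Lipschitz constant k_j = 1
W0 = np.array([[0.2, -0.3], [0.1, 0.25]])
Wtau = np.array([[0.15, 0.1], [-0.2, 0.1]])
I = np.array([0.5, -0.2])
tau = np.array([1.0, 0.5])           # delays tau_j >= 0

dt, T = 0.01, 20.0
steps = int(T / dt)
lag = (tau / dt).astype(int)         # delays expressed in time steps
u = np.zeros((steps + 1, 2))         # trajectory buffer; constant zero initial history

for t in range(steps):
    # delayed states u_j(t - tau_j), clamped to the initial history for t < tau_j
    u_del = np.array([u[max(t - lag[j], 0), j] for j in range(2)])
    du = -g(u[t]) + W0 @ f(u[t]) + Wtau @ f(u_del) + I
    u[t + 1] = u[t] + dt * du

# For these illustrative weights the trajectory is expected to settle near an equilibrium.
print("state after T =", T, ":", u[-1])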

Sufficient conditions for global stability

In this section we derive some sufficient conditions for globally asymptotic stability of the neural network model (1); here, globally asymptotic stability means global asymptotic stability independent of the delays.

Theorem 1

For neural network model (1), if the spectral radius of the matrix M−1(|W0|+|Wτ|)K is less than 1, i.e. ρ(M−1(|W0|+|Wτ|)K)<1, then there exists at most one equilibrium for Eq. (1) and when Eq. (1) does have an equilibrium, the equilibrium is globally asymptotically stable.
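As an informal complement (not from the paper): under the illustrative choice g_i(u) = u and f_j = tanh used in the sketch above, an equilibrium u* of Eq. (1) satisfies u* = (W0 + Wτ) tanh(u*) + I, so it can be located by simple fixed-point iteration. The sketch below uses hypothetical weights for which condition (i) of Theorem 1 holds (with M = K = I); convergence of this particular iteration follows from a contraction argument for these specific numbers and is not a claim made by the theorem itself.

import numpy as np

# Locate the equilibrium of the illustrative system u* = (W0 + Wtau) tanh(u*) + I.
# All numbers are hypothetical; M = K = I here, and rho(|W0| + |Wtau|) < 1,
# so Theorem 1's spectral-radius condition is satisfied for this example.
W0 = np.array([[0.2, -0.3], [0.1, 0.25]])
Wtau = np.array([[0.15, 0.1], [-0.2, 0.1]])
I = np.array([0.5, -0.2])

rho = max(abs(np.linalg.eigvals(np.abs(W0) + np.abs(Wtau))))
assert rho < 1, "condition (i) fails for this example"

u = np.zeros(2)
for _ in range(200):                          # plain fixed-point iteration
    u_next = (W0 + Wtau) @ np.tanh(u) + I
    if np.max(np.abs(u_next - u)) < 1e-12:
        break
    u = u_next
print("approximate equilibrium:", u)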

Proof

Note that the equilibrium

Conclusion

Some sufficient conditions for globally asymptotic stability independent of delays have been obtained for a class of continuous-time continuous-state nonlinear neural networks with delays. The network model considered here is general and includes delayed Hopfield-type networks, BAM networks and CNNs as special cases. Some of our results are improvements on previous results established by other researchers.

Acknowledgements

The author wishes to thank the anonymous referees and editors for their valuable suggestions. This work is supported by the National Natural Science Foundation of China and the Center for Software and Theory of Universities in Shanghai.

