Neural Networks

Volume 19, Issue 1, January 2006, Pages 76-83

A new approach to exponential stability analysis of neural networks with time-varying delays

https://doi.org/10.1016/j.neunet.2005.05.005

Abstract

This paper considers the problem of exponential stability analysis of neural networks with time-varying delays. The activation functions are assumed to be globally Lipschitz continuous. A linear matrix inequality (LMI) approach is developed to derive sufficient conditions ensuring that the delayed neural network has a unique equilibrium point that is globally exponentially stable. The proposed LMI conditions can be checked easily by recently developed algorithms for solving LMIs. Examples are provided to demonstrate the reduced conservativeness of the proposed results.

Introduction

In recent years, artificial neural networks have been widely studied due to their extensive applications in pattern classification, associative memories, image processing, quadratic optimization, and other areas (Borkar and Soumyanatha, 1997, Chua and Yang, 1988, Cichocki and Unbehauen, 1993, Michel and Liu, 2002). In the implementation of artificial neural networks, however, time delays are unavoidably encountered (Hopfield, 1984). It has been found that the existence of time delays may lead to instability and oscillation in a neural network. Therefore, the stability analysis of neural networks with time delays has received much attention; see, e.g. (Cao, 2001, Liao and Wang, 2000, Michel and Liu, 2002, Van Den Driessche and Zou, 1998), and the references therein.

In the stability analysis of delayed neural networks, the qualitative properties of primary concern are the uniqueness, global asymptotic stability and global exponential stability of the equilibrium point. In the case when the time delay is constant, a sufficient condition ensuring the uniqueness and global asymptotic stability of the equilibrium point of delayed neural networks with monotonically non-decreasing continuous activation functions was proposed in Arik (2003); this condition was shown to be less conservative than some earlier asymptotic stability results in Cao, 2001, Liao and Wang, 2000. When the delays are time-varying and the upper bound of their derivatives is less than one, some asymptotic stability conditions were provided in Joy (2000), while in Hou and Qian, 1998, Xu et al., 2001, via different approaches, asymptotic stability results were obtained for neural networks with time-varying delays that are bounded but not required to be differentiable. On the other hand, in the design of neural networks one is interested not only in global stability but also in other performance properties. One such important property is global exponential stability, which guarantees that, whatever transformation occurs, the network's ability to rapidly store the activity pattern is left invariant by self-organization (Liao, Chen & Sanchez, 2002). Therefore, the exponential stability of delayed neural networks has been analyzed by many researchers.

In the context of the differentiable time-varying delay case, sufficient conditions for exponential stability were derived in Liao et al. (2002), where the Lyapunov–Krasovskii method was developed. However, it was shown in Arik (2004) that the results in Liao et al. (2002) may not be correct in some cases; they were therefore corrected, and some new sufficient conditions were presented, in Arik (2004). It is noted that in both Arik, 2004, Liao et al., 2002 the uniqueness of the equilibrium point was not established. For neural networks with continuous bounded delays, exponential stability results can be found in Zhang, 2003, Zhou and Cao, 2002, respectively, for different types of activation functions. It is worth pointing out here that the stability conditions proposed in Arik (2004) are not linear matrix inequalities (LMIs) with respect to the parameters to be determined, and are therefore not easy to check.

In this paper, we are concerned with the problem of exponential stability analysis for neural networks with time-varying delays. The activation functions are assumed to be globally Lipschitz continuous; they are more general than the monotonically non-decreasing continuous activation functions usually assumed in the analysis of neural networks. Firstly, for time-varying delays that are differentiable with derivatives whose upper bound is less than one, a new approach is developed to obtain a sufficient condition guaranteeing the existence, uniqueness and global exponential stability of the equilibrium point of such delayed neural networks. This condition is less conservative than those in Arik (2004), which is demonstrated via a numerical example. It is also shown that all the results in Arik (2004) are equivalent to certain delay-independent ones. Secondly, for time-varying delays that are bounded but not necessarily differentiable, a sufficient condition is proposed via a similar approach; it is less conservative than those obtained in Zhang, 2003, Zhou and Cao, 2002, as illustrated via a numerical example. Under this condition, the existence, uniqueness and global exponential stability of the equilibrium point of this kind of delayed neural network is established. In both cases, the conditions are expressed in terms of LMIs, which can be checked numerically very efficiently by resorting to recently developed interior-point methods, and no tuning of parameters is involved (Boyd, El Ghaoui, Feron & Balakrishnan, 1994).
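As a rough, self-contained sketch of the kind of positive-definite certificate an LMI solver searches for (not the paper's actual conditions; the Hurwitz matrix below is hypothetical), one can solve a Lyapunov equation $A^TP+PA=-Q$ by vectorization and verify that the solution $P$ is positive definite:

```python
import numpy as np

# Hypothetical Hurwitz matrix (illustrative only; not from the paper's examples)
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
n = A.shape[0]
Q = np.eye(n)

# Solve the Lyapunov equation A^T P + P A = -Q via Kronecker vectorization:
# (I (x) A^T + A^T (x) I) vec(P) = vec(-Q)
K = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
P = np.linalg.solve(K, (-Q).reshape(-1)).reshape(n, n)
P = (P + P.T) / 2  # symmetrize against round-off

# P > 0 certifies exponential stability of x' = A x
assert np.all(np.linalg.eigvalsh(P) > 0)
```

In practice the paper's LMI conditions would be handed to a semidefinite programming solver based on interior-point methods; the direct linear solve above only illustrates the certificate idea for a fixed Lyapunov equation.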

Throughout this paper, for real symmetric matrices X and Y, the notation $X \ge Y$ (respectively, $X > Y$) means that the matrix $X - Y$ is positive semi-definite (respectively, positive definite). The superscript 'T' represents the transpose. We use $\lambda_{\min}(\cdot)$ and $\lambda_{\max}(\cdot)$ to denote the minimum and maximum eigenvalues of a real symmetric matrix, respectively. The notation $\|x\|$ denotes the Euclidean vector norm, $\|x\| = \left(\sum_{i=1}^{n} x_i^2\right)^{1/2}$, when x is a vector. For a matrix A, $\|A\|$ denotes the spectral norm, $\|A\| = (\lambda_{\max}(A^TA))^{1/2}$, while $|A|$ denotes the matrix whose (i, j) entry is $|a_{ij}|$, i.e. $|A| = \{|a_{ij}|\}$. $\rho(\cdot)$ denotes the spectral radius of a matrix. Matrix dimensions, if not explicitly stated, are assumed to be compatible for algebraic operations.
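The quantities defined above are straightforward to compute; a minimal NumPy illustration with a hypothetical matrix (note that the spectral radius never exceeds the spectral norm):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  0.5]])

# spectral norm ||A|| = (lambda_max(A^T A))^{1/2}
spec_norm = np.sqrt(np.linalg.eigvalsh(A.T @ A).max())
assert np.isclose(spec_norm, np.linalg.norm(A, 2))

abs_A = np.abs(A)                       # |A|: entrywise absolute values
rho = max(abs(np.linalg.eigvals(A)))    # spectral radius rho(A)

x = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(x), 5.0)  # Euclidean norm (3^2 + 4^2)^{1/2}
```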

Problem formulation

Consider the following neural network with a time-varying delay, described by a non-linear delay differential equation of the form $$\dot{u}(t) = -Au(t) + W_0\,g(u(t)) + W_1\,g(u(t-\tau(t))) + I, \qquad u(t) = \phi(t), \quad -\bar{\tau} \le t \le 0,$$ where $u(t) = [u_1(t)\ u_2(t)\ \cdots\ u_n(t)]^T$ is the state vector of the neural network and n denotes the number of neurons, $g(u(t)) = [g_1(u_1(t))\ g_2(u_2(t))\ \cdots\ g_n(u_n(t))]^T$ is the neuron activation function, and $\phi(t)$ denotes the initial condition. In the neural network (1), $A = \mathrm{diag}(a_1, a_2, \ldots, a_n)$, where each scalar $a_i > 0$ is the rate with which
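A delayed network of this form can be integrated numerically. The sketch below uses forward Euler with a tanh activation (globally Lipschitz with constant 1), a constant delay for simplicity, and illustrative parameters chosen so that the decay rates in A dominate the weight norms; none of these choices come from the paper:

```python
import numpy as np

def simulate(A, W0, W1, I, tau, u0, dt=1e-3, T=10.0):
    """Forward-Euler integration of
        du/dt = -A u(t) + W0 g(u(t)) + W1 g(u(t - tau)) + I,
    with g = tanh and a constant initial condition phi(t) = u0 on [-tau, 0]."""
    A, W0, W1, I = (np.asarray(m, dtype=float) for m in (A, W0, W1, I))
    delay_steps = max(int(round(tau / dt)), 1)
    traj = [np.asarray(u0, dtype=float)]
    for k in range(int(round(T / dt))):
        u = traj[-1]
        u_delayed = traj[max(k - delay_steps, 0)]  # phi(t) = u0 for t <= 0
        du = -A @ u + W0 @ np.tanh(u) + W1 @ np.tanh(u_delayed) + I
        traj.append(u + dt * du)
    return np.array(traj)

# Hypothetical parameters: ||W0|| + ||W1|| < min_i a_i, so the trajectory
# should decay toward the origin (the unique equilibrium when I = 0).
A = np.eye(2)
W0 = np.array([[0.2, -0.1], [0.1, 0.2]])
W1 = np.array([[0.1, 0.05], [-0.05, 0.1]])
traj = simulate(A, W0, W1, I=np.zeros(2), tau=0.5, u0=[1.0, -1.0])
```

The trajectory's decay toward the origin is consistent with, but of course does not prove, exponential stability; the paper's LMI conditions provide the actual guarantee.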

Main results

We first introduce the following lemma, which will be used in the proof of our main results.

Lemma 1

(Xu et al., 2003, Xu et al., 2004) Let D, S and P be real matrices of appropriate dimensions with $P > 0$. Then for any vectors x and y with appropriate dimensions, $$2x^TDSy \le x^TDPD^Tx + y^TS^TP^{-1}Sy.$$
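The inequality of Lemma 1 is easy to check numerically for random data. In the sketch below, P plays the role of the positive definite matrix in the lemma, and all matrices are randomly generated:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3
D = rng.standard_normal((n, n))
S = rng.standard_normal((n, m))
x = rng.standard_normal(n)
y = rng.standard_normal(m)

# random symmetric positive definite P
M = rng.standard_normal((n, n))
P = M @ M.T + n * np.eye(n)

lhs = 2 * x @ D @ S @ y
rhs = x @ D @ P @ D.T @ x + y @ S.T @ np.linalg.inv(P) @ S @ y
assert lhs <= rhs + 1e-9  # 2 x^T D S y <= x^T D P D^T x + y^T S^T P^{-1} S y
```

The inequality follows from expanding $(P^{1/2}D^Tx - P^{-1/2}Sy)^T(P^{1/2}D^Tx - P^{-1/2}Sy) \ge 0$, which is why it holds for every choice of x, y and $P>0$.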

Now, we are in a position to give an exponential stability condition for the delayed neural network in (7) under Case (I).

Theorem 2

Under Case (I), the origin of the delayed neural network in (7) is the unique equilibrium point and is globally exponentially stable.

Conclusions

This paper has studied the problem of exponential stability analysis for neural networks with time-varying delays. The activation functions are assumed to be globally Lipschitz continuous. Two cases of time-varying delays have been considered: delays that are differentiable with delay-derivatives whose upper bound is less than one, and delays that are bounded but not necessarily differentiable. An LMI approach has been developed and sufficient conditions have been obtained, which ensure the existence, uniqueness and global exponential stability of the equilibrium point.


This work is partially supported by HKU CRCG 10205878, the Program for New Century Excellent Talents in University, the National Natural Science Foundation of P.R. China under Grant 60304001, the Fok Ying Tung Education Foundation under Grant 91061, and the Foundation for the Author of National Excellent Doctoral Dissertation of P.R. China under Grant 200240.
