Global exponential stability of cellular neural networks with variable delays
Introduction
Time delay is commonly encountered in biological and artificial neural networks [1], [2], [3], [4], [5], [6], and its existence is frequently a source of oscillation and instability [7], [8], [9], [10]. Therefore, the stability analysis of delayed neural networks has been a topic of both theoretical and practical importance. This stability issue has also gained increasing attention for its essential role in signal processing, image processing, pattern classification, associative memories, fixed-point computation, and so on. Recently, using various analysis methods, many criteria for the global stability of neural networks with constant or time-varying delays have been presented [11], [12], [13], [14], [15], [16].
As is well known, fast convergence of a system is essential for real-time computation, and the exponential convergence rate is generally used to quantify the speed of neural computations. Thus, the global exponential stability of neural networks with or without delays has also been investigated [17], [18], [19], [20] in recent years.
In this paper, we study the exponential stability of, and estimate the exponential convergence rates for, neural networks with time-varying delays. Lyapunov–Krasovskii functionals and linear matrix inequality (LMI) approaches are combined to investigate the problem. A novel delay-dependent criterion is presented in terms of LMIs. The advantage of the proposed approach is that the resulting stability criterion can be checked efficiently via existing numerical convex optimization algorithms, such as the interior-point algorithms for solving LMIs [21].
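To illustrate the numerical solve-and-check pattern behind such criteria, the sketch below treats only the delay-free linear part ẋ(t) = −A x(t), for which the Lyapunov inequality (−A)ᵀP + P(−A) < 0, P > 0 reduces to a Lyapunov equation once the right-hand side is fixed to −I. The matrix A and all values are illustrative assumptions, not taken from the paper; the paper's actual delay-dependent LMI would instead be passed to an SDP solver, but the feasibility-check pattern is the same.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical positive diagonal self-feedback matrix A, as in the paper's
# model class; these values are purely illustrative.
A = np.diag([2.0, 1.5])

# Solve (-A) P + P (-A)^T = -I for P; a positive definite solution P
# certifies exponential stability of the delay-free part x'(t) = -A x(t).
P = solve_continuous_lyapunov(-A, -np.eye(2))

# The certificate holds iff P is symmetric positive definite.
print(np.linalg.eigvalsh(P))
```

For the full delay-dependent criterion, the same check would be posed as LMI feasibility and handed to an interior-point SDP solver rather than solved in closed form.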
Throughout the paper, ℝⁿ denotes the n-dimensional Euclidean space, and ℝⁿˣᵐ is the set of all n × m real matrices. I denotes the identity matrix of appropriate dimensions. ∥x∥ denotes the Euclidean norm of a vector x. λM(·) and λm(·) denote the largest and smallest eigenvalues of a given matrix, respectively. ★ denotes the elements below the main diagonal of a symmetric block matrix. diag{·} denotes a block diagonal matrix. For symmetric matrices X and Y, the notation X > Y (respectively, X ⩾ Y) means that the matrix X − Y is positive definite (respectively, positive semidefinite).
Main results
Consider a continuous-time neural network with time-varying delays described by the following state equations:
ẋ(t) = −A x(t) + W f(x(t)) + W₁ f(x(t − h(t))) + b,
or equivalently, shifting the equilibrium point x* to the origin via z(t) = x(t) − x*,
ż(t) = −A z(t) + W g(z(t)) + W₁ g(z(t − h(t))),
where x(t) = [x₁(t), … , xₙ(t)]ᵀ ∈ ℝⁿ is the neuron state vector, f(x(t)) = [f₁(x₁(t)), … , fₙ(xₙ(t))]ᵀ is the vector of activation functions, h(t) is the time-varying delay, b = [b₁, … , bₙ]ᵀ is a constant input vector, A = diag(aᵢ) is a positive diagonal matrix, and W and W₁ are the connection weight matrix and the delayed connection weight matrix, respectively.
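A minimal Euler-method simulation of the state equation above can be sketched as follows. All parameter values, the constant delay h, and the tanh activations are illustrative assumptions for this sketch, not values from the paper.

```python
import numpy as np

# Hypothetical parameters for  x'(t) = -A x(t) + W f(x(t)) + W1 f(x(t-h)) + b
A = np.diag([2.0, 2.0])                      # positive diagonal matrix
W = np.array([[0.2, -0.1], [0.1, 0.2]])      # connection weights
W1 = np.array([[-0.1, 0.1], [0.1, -0.1]])    # delayed connection weights
b = np.array([0.5, -0.5])                    # constant input vector
f = np.tanh                                  # activation function

dt = 0.001
h = 0.5                                      # constant delay, for simplicity
delay_steps = int(h / dt)
steps = int(20.0 / dt)

# Constant initial function on [-h, 0], stored in a delay buffer.
x = np.array([1.0, 1.0])
hist = [x.copy() for _ in range(delay_steps + 1)]

for _ in range(steps):
    x_delayed = hist[0]                      # state at time t - h
    dx = -A @ x + W @ f(x) + W1 @ f(x_delayed) + b
    x = x + dt * dx                          # explicit Euler step
    hist.append(x.copy())
    hist.pop(0)

# At equilibrium the right-hand side vanishes.
print(np.linalg.norm(-A @ x + W @ f(x) + W1 @ f(x) + b))
```

With these small weights the trajectory settles at the unique equilibrium, consistent with the exponential convergence the paper's criterion is designed to certify.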
Concluding remarks
In this paper, the problems of exponential stability and exponential convergence rate for neural networks with time-varying delays have been studied. The exponential stability criterion obtained in this paper, which depends on the size of the time delay, is derived within the Lyapunov–Krasovskii functional and LMI framework. Numerical examples have shown that the novel criterion is less conservative than those reported in the literature.
References
- Global stability of bidirectional associative memory neural networks with distributed delays, Phys. Lett. A (2002)
- Global exponential stability of BAM neural networks with delays and impulses, Chaos Solitons Fractals (2005)
- Existence and stability of almost periodic solution for BAM neural networks with delays, Appl. Math. Comput. (2003)
- Exponential stability of continuous-time and discrete-time bidirectional associative memory networks with delays, Chaos Solitons Fractals (2004)
- Stability analysis of delayed cellular neural networks, Neural Networks (1998)
- Global asymptotic stability of a larger class of neural networks with constant time delay, Phys. Lett. A (2003)
- A novel criterion for global asymptotic stability of BAM neural networks with time delays, Chaos Solitons Fractals (2006)
- Global asymptotic stability of Hopfield neural networks with transmission delays, Phys. Lett. A (2003)
- Delay-dependent exponential stability analysis of delayed neural networks: an LMI approach, Neural Networks (2002)
- New exponential stability results for delayed neural networks with time varying delays, Physica D (2004)