New delay-dependent exponential stability criteria of BAM neural networks with time delays

https://doi.org/10.1016/j.matcom.2008.08.014

Abstract

In this paper, global exponential stability is investigated for bi-directional associative memory (BAM) neural networks with time delays. Several new sufficient conditions ensuring global exponential stability of delayed BAM neural networks are presented, based on the Lyapunov functional method together with the linear matrix inequality (LMI) technique. To the best of our knowledge, few reports on such a “linearization” approach to exponential stability analysis of delayed neural network models have appeared in the literature. A method called the parameterized first-order model transformation is used to transform the neural networks. The obtained conditions are shown to be less conservative and restrictive than those reported in the literature. Two numerical simulations are also given to illustrate the efficiency of our results.

Introduction

Kosko proposed a series of neural networks related to bi-directional associative memory (BAM) neural networks, which have the ability of information memory and information association [11], [12], [13]. However, the dynamics of neural networks often involve time delays, due to the finite switching speed of amplifiers in electronic neural networks or to the finite signal propagation time in biological networks. Time delays can affect the stability of a network by creating instability, oscillation and chaos. In view of this, the stability of delayed neural networks is a topic of great practical importance [1], [7], [14], [16], [26], and has gained increasing interest owing to potential applications in signal processing, optimization, image processing and other fields [8], [20].

Recently, some sufficient conditions for the stability of BAM networks with time delays have been derived (see [2], [4], [6], [9], [10], [15], [17], [19], [22], [23], [24], [25], [27], [28]). In [4], the existence of an equilibrium and the global asymptotic stability of continuous bi-directional associative memory (BAM) neural networks with axonal signal transmission delay are studied by the Lyapunov method, and new sufficient conditions are derived to ascertain global asymptotic stability of the BAM networks. In [28], the authors present a new sufficient condition for the existence, uniqueness and global exponential stability of the equilibrium point of BAM neural networks with constant delays. Huang et al. [10] proposed a set of criteria for the exponential stability of BAM neural networks with constant or time-varying delays, based on the Lyapunov–Krasovskii functional in combination with the linear matrix inequality (LMI) approach. These criteria manifest explicitly the influence of time delay on the exponential convergence rate and show the differences between the excitatory and inhibitory effects.

In this paper, inspired by [1], [7], [14], [16], [26], we present some criteria for exponential stability and estimate the exponential convergence rate of BAM neural networks with time delays. Some new conditions for the global exponential stability of BAM networks with constant delay are given in terms of LMIs by constructing a suitable Lyapunov functional. Firstly, we shift the nonlinear neural network model to a linear one by employing a simple transformation. Secondly, a process called the parameterized first-order model transformation is used to transform the linear system. Then, we establish novel sufficient conditions to ensure the existence, uniqueness, and global exponential stability of the equilibrium point of BAM neural networks with time delays. The results obtained in this paper are different from those in the earlier literature [5], [10], [15], [27]. Compared with earlier results in the literature, our conditions are less restrictive and conservative.

The rest of this paper is organized as follows. In Section 2, the problem to be studied is stated and some needed preliminaries and lemmas are given. In Section 3, some global exponential stability criteria are presented for delayed BAM neural networks by means of the Lyapunov-type stability theorem and linear matrix inequality (LMI). In Section 4, two illustrative examples are given to show the effectiveness of our results. Finally, some conclusions are drawn in Section 5.


Neural network model and preliminaries

In this paper, we consider the following BAM neural network model with time delays:
$$\dot{u}_i(t) = -a_i u_i(t) + \sum_{j=1}^{m} w_{ij} f_j(h_j(t-\tau)) + I_i, \quad i = 1, 2, \ldots, n,$$
$$\dot{h}_j(t) = -b_j h_j(t) + \sum_{i=1}^{n} v_{ji} g_i(u_i(t-\sigma)) + J_j, \quad j = 1, 2, \ldots, m,$$
or equivalently
$$\dot{u}(t) = -Au(t) + Wf(h(t-\tau)) + I, \qquad \dot{h}(t) = -Bh(t) + Vg(u(t-\sigma)) + J,$$
where $a_i$ and $b_j$ denote the neuron charging time constants and passive decay rates, respectively, $w_{ij}$ and $v_{ji}$ are the synaptic connection strengths, $f_j$ and $g_i$ represent the activation functions of the neurons and the propagational signal functions,
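An equilibrium $(u^*, h^*)$ of this model satisfies $Au^* = Wf(h^*) + I$ and $Bh^* = Vg(u^*) + J$. The sketch below (all matrices and inputs are illustrative choices, not taken from the paper) locates such an equilibrium by a simple alternating fixed-point iteration, which converges here because the Lipschitz-1 activations and the small coupling gains make the map a contraction.

```python
import numpy as np

# Fixed-point iteration for an equilibrium (u*, h*) of the BAM model:
#   A u* = W f(h*) + I,   B h* = V g(u*) + J.
# The matrices and external inputs below are hypothetical examples.
A = np.diag([1.0, 2.0]); B = np.diag([3.0, 4.0])
W = np.array([[0.2, -0.1], [0.3, 0.1]])
V = np.array([[-0.2, 0.4], [0.1, 0.3]])
I_ext = np.array([0.5, -0.3]); J_ext = np.array([0.2, 0.4])
f = g = np.tanh                      # Lipschitz-1 activation functions

u = np.zeros(2); h = np.zeros(2)
for _ in range(200):
    u = np.linalg.solve(A, W @ f(h) + I_ext)   # u <- A^{-1}(W f(h) + I)
    h = np.linalg.solve(B, V @ g(u) + J_ext)   # h <- B^{-1}(V g(u) + J)

# residuals of the equilibrium equations (should be near machine precision)
res_u = float(np.max(np.abs(-A @ u + W @ f(h) + I_ext)))
res_h = float(np.max(np.abs(-B @ h + V @ g(u) + J_ext)))
print(res_u, res_h)
```

With these gains $\|A^{-1}W\|_\infty \|B^{-1}V\|_\infty \approx 0.06 < 1$, so the iteration contracts rapidly; for general weights such a contraction condition is exactly what the stability criteria of the paper are designed to relax.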

Global exponential stability analysis for delayed BAM neural networks

In this section, we present sufficient conditions for the uniqueness and global exponential stability of the equilibrium point for the delayed BAM neural networks.

In Ref. [18], the equality
$$x(t-\tau) = x(t) - \int_{-\tau}^{0} \dot{x}(t+\xi)\,d\xi = x(t) - \int_{-\tau}^{0} \left[Ax(t+\xi) + A_d x(t+\xi-\tau)\right] d\xi$$
was used to transform the system
$$\dot{x}(t) = Ax(t) + A_d x(t-\tau)$$
into the distributed delay system
$$\dot{x}(t) = (A+C)x(t) + (A_d - C)x(t-\tau) - C\int_{-\tau}^{0} \left[Ax(t+\theta) + A_d x(t+\theta-\tau)\right] d\theta,$$
where C is a parameter matrix which makes the stability result less restrictive to some degree. Such process is
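The algebraic step behind this transformation can be spelled out: split the delayed term with the parameter matrix $C$ and replace the remainder using the Leibniz–Newton identity above.

```latex
A_d x(t-\tau) = (A_d - C)\,x(t-\tau) + C\,x(t-\tau),
\qquad
C\,x(t-\tau) = C\,x(t) - C\int_{-\tau}^{0}
  \left[Ax(t+\theta) + A_d x(t+\theta-\tau)\right] d\theta .
```

Substituting both into $\dot{x}(t) = Ax(t) + A_d x(t-\tau)$ yields the distributed-delay form; choosing $C = 0$ recovers the original system, so the free parameter $C$ can only enlarge the set of systems for which the resulting stability condition is feasible.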

Examples

In this section, we give two examples to illustrate our results.

Example 1

Consider the following system [10]:

$$\begin{aligned}
\dot{x}_1(t) &= -x_1(t) + 0.05 f(y_1(t-\tau)) + 0.25 f(y_2(t-\tau)) + 0.05 f(y_3(t-\tau)),\\
\dot{x}_2(t) &= -x_2(t) + 0.1 f(y_1(t-\tau)) + 0.05 f(y_2(t-\tau)) + 0.15 f(y_3(t-\tau)),\\
\dot{x}_3(t) &= -x_3(t) + 0.15 f(y_1(t-\tau)) + 0.15 f(y_2(t-\tau)) + 0.05 f(y_3(t-\tau)),\\
\dot{y}_1(t) &= -4y_1(t) + 0.75 f(x_1(t-\sigma)) + 0.75 f(x_2(t-\sigma)) + 0.95 f(x_3(t-\sigma)),\\
\dot{y}_2(t) &= -4y_2(t) + 0.5 f(x_2(t-\sigma)) + 0.75 f(x_3(t-\sigma)),\\
\dot{y}_3(t) &= -4y_3(t) + 0.15 f(x_1(t-\sigma)) + 0.95 f(x_2(t-\sigma)) + 0.95 f(x_3(t-\sigma)),
\end{aligned}$$
where the activation function is described by $f(x) = \tfrac{1}{2}(|x+1| - |x-1|)$
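A quick numerical check of this example is sketched below: forward-Euler integration of the six delayed equations, with the piecewise-linear activation $f(x) = \tfrac{1}{2}(|x+1|-|x-1|)$. The delays $\tau = \sigma = 1$ and the constant initial history are illustrative choices, not values taken from the paper; with the given weights, all states decay toward the equilibrium at the origin.

```python
import numpy as np

# Forward-Euler integration of the delayed BAM system of Example 1.
# tau = sigma = 1 and the initial history are illustrative assumptions.
W = np.array([[0.05, 0.25, 0.05],
              [0.10, 0.05, 0.15],
              [0.15, 0.15, 0.05]])
V = np.array([[0.75, 0.75, 0.95],
              [0.00, 0.50, 0.75],
              [0.15, 0.95, 0.95]])
f = lambda s: 0.5 * (np.abs(s + 1.0) - np.abs(s - 1.0))

dt, tau, T = 0.01, 1.0, 60.0
d = int(round(tau / dt))                          # delay measured in steps
x_hist = np.tile([0.5, -0.4, 0.3], (d + 1, 1))    # constant initial history
y_hist = np.tile([-0.2, 0.6, -0.5], (d + 1, 1))
for _ in range(int(round(T / dt))):
    x, y = x_hist[-1], y_hist[-1]
    x_new = x + dt * (-x + W @ f(y_hist[0]))         # uses y(t - tau)
    y_new = y + dt * (-4.0 * y + V @ f(x_hist[0]))   # uses x(t - sigma)
    x_hist = np.vstack([x_hist[1:], x_new])
    y_hist = np.vstack([y_hist[1:], y_new])

# distance of the final state from the origin (the equilibrium)
final_dev = float(np.max(np.abs(np.concatenate([x_hist[-1], y_hist[-1]]))))
print(final_dev)
```

The small coupling gains relative to the decay rates (here $\|W\|_\infty \|V\|_\infty = 0.35 \times 2.45 < 1 \times 4$) are what drive this convergence, consistent with the exponential stability the criteria assert.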

Conclusions

We have investigated the global exponential stability problem via a novel approach. By employing a simple transformation, we first shifted the nonlinear neural network model to a linear one. Then, via an approach combining the Lyapunov stability theorem and the linear matrix inequality (LMI) technique, we have derived some new sufficient conditions for the global exponential stability of a general class of delayed BAM neural networks. In comparison with some recent results reported in the

Acknowledgements

The authors are greatly indebted to anonymous referees for their constructive comments. The work described in this paper was partially supported by the National Natural Science Foundation of China (Grant No. 60573047, 60574024) and Natural Science Foundation Project of CQ CSTC (Grant No. 2008BB2366, 2007BB2231), Program for New Century Excellent Talents in University, the Doctoral Foundation Project of Chongqing Normal University (Grant No. 08XLB003) and the Applying Basic Research Program of
