Elsevier

Neurocomputing

Volume 70, Issues 7–9, March 2007, Pages 1382-1389
An analysis of global exponential stability of bidirectional associative memory neural networks with constant time delays

https://doi.org/10.1016/j.neucom.2006.06.003

Abstract

This paper presents a new sufficient condition for the existence, uniqueness and global exponential stability of the equilibrium point of bidirectional associative memory (BAM) neural networks with constant delays. The new condition is compared with previous results from the literature, showing that it provides an additional set of criteria for determining the global exponential stability of BAM neural networks with constant delays.

Introduction

Neural networks have been intensively studied in the past decade and applied in various fields, such as designing associative memories and solving optimization problems. These applications rely crucially on the dynamical behavior of the designed network, so studying the stability of equilibria is of prime importance. In recent years, researchers [1], [2], [3], [4], [5], [6], [7], [8], [10], [11], [12], [13], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27] have paid particular attention to bidirectional associative memory (BAM) neural networks with time delays, because these networks have been shown to be useful models in pattern recognition, optimization, and automatic control engineering. Some useful results on the uniqueness and global asymptotic stability of the equilibrium of BAM neural networks with delays can be found in [3], [4], [5], [6], [7], [8], [15], [18], [22], [24].

As is well known, fast convergence of a system is essential for real-time computation, and the exponential convergence rate determines the speed of neural computations. Moreover, the exponential stability property guarantees that, whatever transformation occurs, the network's ability to rapidly store the activity pattern remains invariant under self-organization. Thus, it is not only theoretically interesting but also practically important to determine the exponential stability of, and to estimate the exponential convergence rate for, BAM neural networks. Global exponential stability for neural networks with delays has been investigated and some sufficient conditions have been derived in [6], [7], [8], [15]. In this paper, we focus on the global exponential stability of BAM neural networks with time delays. By constructing a suitable Lyapunov functional, we obtain new criteria for the global exponential stability of BAM neural networks with time delays; these criteria generalize previous results.

Consider the bidirectional associative memory neural network model with delays described by the state equations [4]:

$$\dot{u}_i(t) = -a_i u_i(t) + \sum_{j=1}^{m} w_{ij}\, s_j^{(1)}\big(z_j(t-\tau_1)\big) + J_i^{(1)}, \quad i=1,2,\ldots,n,$$
$$\dot{z}_j(t) = -b_j z_j(t) + \sum_{i=1}^{n} v_{ji}\, s_i^{(2)}\big(u_i(t-\tau_2)\big) + J_j^{(2)}, \quad j=1,2,\ldots,m, \tag{1.1}$$

where $a_i$ and $b_j$ denote the neuron charging time constants and passive decay rates, respectively; $w_{ij}$ and $v_{ji}$ are synaptic connection strengths; $s_j^{(1)}$ and $s_i^{(2)}$ represent the activation functions of the neurons and the propagational signal functions, respectively; and $J_i^{(1)}$ and $J_j^{(2)}$ are the exogenous inputs. In order to establish the global exponential stability conditions for neural networks (1.1) and make a precise comparison between our stability conditions and previous results derived in the literature, we make some traditional assumptions on the activation functions:
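As an illustration only, system (1.1) can be integrated numerically. The sketch below uses the forward Euler method with a history buffer for the delayed terms; all parameter values and the tanh activations are assumptions chosen for demonstration, not taken from the paper.

```python
import numpy as np

def simulate_bam(n=2, m=2, tau1=0.5, tau2=0.5, dt=0.01, T=20.0, seed=0):
    """Forward-Euler simulation of the delayed BAM system (1.1)."""
    rng = np.random.default_rng(seed)
    a = np.ones(n)                         # charging time constants a_i
    b = np.ones(m)                         # passive decay rates b_j
    W = 0.3 * rng.standard_normal((n, m))  # synaptic strengths w_ij
    V = 0.3 * rng.standard_normal((m, n))  # synaptic strengths v_ji
    J1 = rng.standard_normal(n)            # exogenous inputs J_i^(1)
    J2 = rng.standard_normal(m)            # exogenous inputs J_j^(2)

    steps = round(T / dt)
    d1 = round(tau1 / dt)                  # delay tau_1 in time steps
    d2 = round(tau2 / dt)                  # delay tau_2 in time steps
    u = np.zeros((steps + 1, n))
    z = np.zeros((steps + 1, m))
    u[0] = rng.standard_normal(n)          # constant initial function phi
    z[0] = rng.standard_normal(m)

    for k in range(steps):
        zd = z[max(k - d1, 0)]             # z(t - tau_1), clamped to history
        ud = u[max(k - d2, 0)]             # u(t - tau_2)
        u[k + 1] = u[k] + dt * (-a * u[k] + W @ np.tanh(zd) + J1)
        z[k + 1] = z[k] + dt * (-b * z[k] + V @ np.tanh(ud) + J2)
    return u, z

u, z = simulate_bam()
# With weak coupling the state settles toward a unique equilibrium.
print(np.max(np.abs(u[-1] - u[-2])), np.max(np.abs(z[-1] - z[-2])))
```

For the small random weights used here the coupling is weak relative to the decay rates, so the trajectory converges; larger weights can violate the stability criteria studied in the paper.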

Assumption A1

There exist positive constants $\alpha_i$, $i=1,2,\ldots,m$, and $\beta_j$, $j=1,2,\ldots,n$, such that
$$0 \le \frac{s_i^{(1)}(x)-s_i^{(1)}(y)}{x-y} \le \alpha_i, \quad i=1,2,\ldots,m, \quad \forall x,y \in \mathbb{R},\ x \ne y,$$
and
$$0 \le \frac{s_j^{(2)}(x)-s_j^{(2)}(y)}{x-y} \le \beta_j, \quad j=1,2,\ldots,n, \quad \forall x,y \in \mathbb{R},\ x \ne y.$$
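As a quick aside (not part of the paper), the standard tanh activation satisfies Assumption A1 with $\alpha_i=\beta_j=1$: its difference quotients are bounded between 0 and 1, as a numerical spot check confirms.

```python
import numpy as np

# Sample many pairs (x, y) and verify the difference quotients of tanh
# lie in [0, 1], i.e. tanh is monotone and 1-Lipschitz.
rng = np.random.default_rng(1)
x = rng.uniform(-5, 5, 10000)
y = rng.uniform(-5, 5, 10000)
mask = x != y                      # Assumption A1 requires x != y
q = (np.tanh(x[mask]) - np.tanh(y[mask])) / (x[mask] - y[mask])
print(q.min(), q.max())            # all quotients within [0, 1]
```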

Assumption A2

The activation functions are bounded; that is, $|s_i^{(1)}(x)| \le M_i^{(1)}$ $(i=1,2,\ldots,m)$ and $|s_j^{(2)}(x)| \le M_j^{(2)}$ $(j=1,2,\ldots,n)$, where $M_i^{(1)}$ and $M_j^{(2)}$ denote some positive constants.

Let $\tau=\max\{\tau_1,\tau_2\}$. Initial conditions for (1.1) are of the form
$$\varphi=(\varphi_1,\varphi_2,\ldots,\varphi_n,\varphi_{n+1},\varphi_{n+2},\ldots,\varphi_{n+m})\in C=C([-\tau,0],\mathbb{R}^{n+m}).$$
Here, $C$ is the space of continuous functions equipped with the norm $\|\varphi\|_2=\sup_{-\tau\le t\le 0}\sqrt{\varphi^{\mathrm T}(t)\varphi(t)}$. For any initial condition $\varphi=(\varphi_1,\varphi_2,\ldots,\varphi_{n+m})\in C$, systems (1.1) admit a unique solution, denoted $x(t,\varphi)=(u(t,\varphi),z(t,\varphi))=(u_1(t,\varphi),\ldots,u_n(t,\varphi),z_1(t,\varphi),\ldots,z_m(t,\varphi))$. To simplify the notation, the dependence on the initial condition $\varphi$ will not be indicated unless necessary.

For convenience, set $B=(b_{ij})$, a real $n\times n$ matrix. $B^{\mathrm T}$ and $B^{-1}$ denote the transpose and the inverse of the matrix $B$, respectively. The notation $B>0$ ($B<0$) means that $B$ is symmetric and positive definite (negative definite). $\|B\|_2$ denotes the norm of $B$ induced by the Euclidean vector norm, i.e. $\|B\|_2=(\lambda_{\max}(B^{\mathrm T}B))^{1/2}$, where $\lambda_{\max}(M)$ denotes the maximum eigenvalue of the matrix $M$. $I_n$ denotes the $n\times n$ identity matrix. Sometimes we write $x(t)$ as $x$, $f(x(t))$ as $f(x)$, and the transpose of $A^{-1}$ as $A^{-\mathrm T}$.

We will need the following definition:

Definition 1.1 Khalil [14]

The equilibrium point $x^*=(u^*,z^*)$ is said to be globally exponentially stable if there exist positive constants $k$ and $\gamma$ such that for any solution $x(t)=(u(t),z(t))$ of systems (1.1) with initial function $\varphi\in C([-\tau,0],\mathbb{R}^{n+m})$,
$$\|x(t)-x^*\|_2 \le \gamma\,\|\varphi-x^*\|_2\,e^{-kt} \quad \text{for all } t\ge 0.$$
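For intuition only, the decay rate $k$ in this definition can be estimated empirically from a trajectory. The sketch below simulates a scalar delayed system $\dot u(t)=-a\,u(t)+w\tanh(u(t-\tau))$ (a one-neuron analogue of (1.1); all parameter values are assumptions) and fits the slope of $\log|u(t)|$.

```python
import numpy as np

# Scalar delayed system: u'(t) = -a*u(t) + w*tanh(u(t - tau)).
# Since w < a, u* = 0 is the unique equilibrium; the trajectory
# decays exponentially and we fit |u(t)| ~ gamma * exp(-k t).
a, w, tau, dt, T = 1.0, 0.3, 0.5, 0.01, 30.0
d = round(tau / dt)                 # delay in time steps
steps = round(T / dt)
u = np.zeros(steps + 1)
u[0] = 1.0                          # constant initial function phi
for k in range(steps):
    u[k + 1] = u[k] + dt * (-a * u[k] + w * np.tanh(u[max(k - d, 0)]))

t = np.arange(steps + 1) * dt
mask = (t > 5) & (np.abs(u) > 1e-12)       # skip the initial transient
slope, _ = np.polyfit(t[mask], np.log(np.abs(u[mask])), 1)
print(-slope)                       # estimated decay rate k > 0
```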

The organization of this paper is as follows. In Section 2, we establish a new criterion on the global exponential stability of the equilibrium point for neural networks with delays. Some comparison will be provided in Section 3. Section 4 presents our conclusions.

Main results

Based on some facts about positive-definite matrices and integral inequalities, we present the main results in this section.

Firstly, we have the following lemma due to [27]:

Lemma 2.1

Let $X$, $Y$, $C$ be real matrices of appropriate dimensions with $C>0$, and let $\varepsilon_0>0$ be a scalar. Then the following inequality holds:
$$X^{\mathrm T}Y + Y^{\mathrm T}X \le \varepsilon_0\, X^{\mathrm T}CX + \frac{1}{\varepsilon_0}\, Y^{\mathrm T}C^{-1}Y.$$
In particular, if $X$ and $Y$ are vectors, then $X^{\mathrm T}Y \le (X^{\mathrm T}X + Y^{\mathrm T}Y)/2$.
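The lemma can be sanity-checked numerically. The following sketch (random matrices, illustrative only) verifies that the gap matrix $\varepsilon_0 X^{\mathrm T}CX + \varepsilon_0^{-1} Y^{\mathrm T}C^{-1}Y - (X^{\mathrm T}Y + Y^{\mathrm T}X)$ is positive semidefinite, which is exactly the statement of Lemma 2.1.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 4, 3
X = rng.standard_normal((n, k))
Y = rng.standard_normal((n, k))
M = rng.standard_normal((n, n))
C = M @ M.T + n * np.eye(n)        # symmetric positive definite C > 0
eps0 = 0.7                         # any positive scalar works

gap = (eps0 * X.T @ C @ X
       + (1 / eps0) * Y.T @ np.linalg.inv(C) @ Y
       - (X.T @ Y + Y.T @ X))
eigs = np.linalg.eigvalsh((gap + gap.T) / 2)   # symmetrize for stability
print(eigs.min())                  # nonnegative up to rounding error
```

The identity behind the check: the gap equals $(\sqrt{\varepsilon_0}\,C^{1/2}X - \varepsilon_0^{-1/2}C^{-1/2}Y)^{\mathrm T}(\sqrt{\varepsilon_0}\,C^{1/2}X - \varepsilon_0^{-1/2}C^{-1/2}Y) \ge 0$.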

Before we establish a criterion for the global exponential stability of systems (1.1), we give a lemma concerning the

Comparison with previous results

In this section, we compare our results with the previous results derived in the literature. Let us restate the previous stability results as follows.

Theorem 3.1

Arik [4]

Suppose that in systems (2.9) Assumptions A1 and A2 are satisfied. The origin of neural systems (2.9) is globally asymptotically stable if the following conditions hold:
$$\Omega_3 = -\alpha^{-1}B\alpha^{-1} - 2B\alpha^{-1} + VV^{\mathrm T} + I_m + W^{\mathrm T}A^{-1}W < 0,$$
$$\Omega_4 = -\beta^{-1}A\beta^{-1} - 2A\beta^{-1} + WW^{\mathrm T} + I_n + V^{\mathrm T}B^{-1}V < 0.$$
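To get a concrete feel for these conditions, the following sketch evaluates $\Omega_3$ and $\Omega_4$ for a small sample network and checks negative definiteness. All parameter values are hypothetical assumptions: $A$, $B$ are taken diagonal as in (1.1), $\alpha$, $\beta$ are identity gain bounds, and the weights are chosen weak enough for the test to pass.

```python
import numpy as np

n, m = 2, 2
A = 2.0 * np.eye(n)                 # A = diag(a_i), charging constants
B = 2.0 * np.eye(m)                 # B = diag(b_j), decay rates
alpha = np.eye(m)                   # alpha = diag(alpha_i), gain bounds
beta = np.eye(n)                    # beta = diag(beta_j)
rng = np.random.default_rng(3)
W = 0.2 * rng.standard_normal((n, m))
V = 0.2 * rng.standard_normal((m, n))

ai = np.linalg.inv(alpha)
bi = np.linalg.inv(beta)
Omega3 = (-ai @ B @ ai - 2 * B @ ai + V @ V.T
          + np.eye(m) + W.T @ np.linalg.inv(A) @ W)
Omega4 = (-bi @ A @ bi - 2 * A @ bi + W @ W.T
          + np.eye(n) + V.T @ np.linalg.inv(B) @ V)

# Both matrices are negative definite for this choice of parameters.
print(np.linalg.eigvalsh(Omega3).max() < 0,
      np.linalg.eigvalsh(Omega4).max() < 0)
```

Increasing the weight scale eventually makes the largest eigenvalues positive, at which point Theorem 3.1 no longer applies.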

Theorem 3.2

Chen et al. [7]

Suppose that in systems (2.9), Assumption A1 is satisfied. The origin of neural systems (2.9) is globally

Conclusion

In this paper, we obtained new results for the global exponential stability properties of BAM neural networks with delays. By the Lyapunov functional method and the technique of inequality of integral, the global exponential stability criteria were derived. A comparison between our results and the previous results implies that our results establish a new set of global exponential stability criteria for BAM neural networks with constant delays. Those conditions are less restrictive than those in

Acknowledgment

The authors would like to thank the anonymous reviewers and the editor for their constructive comments.

References (27)

Weirui Zhao received the B.S. degree in mathematics from Huazhong Normal University, Wuhan, China, the M.S. degree and the Ph.D. degree from Fudan University, Shanghai, China, all in mathematics/applied mathematics, in 1989, 1994, and 2003, respectively. From June 1989 to September 2000, he was with Hubei Institute for Nationalities. In June 2003, he joined the Department of Mathematics, Wuhan University of Technology, Wuhan, China. Currently, he is a Postdoctoral Research Fellow in the Shenzhen Graduate School, Harbin Institute of Technology. He is the author or coauthor of 10 journal papers and conference papers. His research interests include stability theory, nonlinear systems, neural networks, and applied mathematics.

Huanshui Zhang graduated in mathematics from the Qufu Normal University in 1986 and received his M.S. and Ph.D. degrees in control theory and signal processing from the Heilongjiang University, PR China, and Northeastern University, PR China, in 1991 and 1997, respectively. He worked as a Postdoctoral Fellow at the Nanyang Technological University from 1998 to 2001 and a Research Fellow at Hong Kong Polytechnic University from 2001 to 2003. He joined Shandong Taishan College in 1986 as an Assistant Professor and became an Associate Professor in 1994. He is currently a Professor in Shenzhen Graduate School of Harbin Institute of Technology. His interests include optimal estimation, robust filtering and control, time delay systems, singular systems, wireless communication, signal processing, and neural networks.

Shulan Kong is an Associate Professor at School of Mathematical Sciences, Qufu Normal University. She graduated in mathematics from Yantai Normal College in 1992 and received her M.S. and Ph.D. degrees in mathematics from Qufu Normal University and Shandong University in 1995 and 2004, respectively. Currently, she is a Postdoctor of Shenzhen Graduate School, Harbin Institute of Technology. Her research interests include nonlinear control systems, wireless communication systems, combinatorial optimization, and neural networks.
