An analysis of global exponential stability of bidirectional associative memory neural networks with constant time delays
Introduction
Neural networks have been intensively studied in the past decade and have been applied in various fields such as designing associative memories and solving optimization problems. These applications rely crucially on the dynamical behavior of the designed neural network; therefore, studying the stability of equilibria is of prime importance. In recent years, researchers [1], [2], [3], [4], [5], [6], [7], [8], [10], [11], [12], [13], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27] have paid particular attention to bidirectional associative memory (BAM) neural networks with time delays because such networks have been shown to be useful models in pattern recognition, optimization, and automatic control engineering. Some useful results on the uniqueness and global asymptotic stability of the equilibrium for BAM neural networks with delays can be found in [3], [4], [5], [6], [7], [8], [15], [18], [22], [24]. As is well known, fast convergence of a system is essential for real-time computation, and the exponential convergence rate is used to determine the speed of neural computations. Moreover, the exponential stability property guarantees that, whatever transformation occurs, the network's ability to rapidly store the activity pattern remains invariant under self-organization. Thus, it is not only theoretically interesting but also practically important to determine the exponential stability of BAM neural networks and to estimate their exponential convergence rate. Global exponential stability for neural networks with delays has been investigated, and some sufficient conditions have been derived, in [6], [7], [8], [15]. In this paper, we focus on the global exponential stability of BAM neural networks with time delays. By constructing a suitable Lyapunov functional, we obtain some new criteria for the global exponential stability of BAM neural networks with time delays; these criteria generalize previous results.
Consider the bidirectional associative memory neural network model with delays described by the state equations [4]:

$\dot{x}_i(t) = -a_i x_i(t) + \sum_{j=1}^{m} w_{ji} f_j(y_j(t-\tau)) + I_i$, $i = 1, \ldots, n$,
$\dot{y}_j(t) = -b_j y_j(t) + \sum_{i=1}^{n} v_{ij} g_i(x_i(t-\sigma)) + J_j$, $j = 1, \ldots, m$, (1.1)

where $a_i$ and $b_j$ denote the neuron charging time constants and passive decay rates, respectively; $w_{ji}$ and $v_{ij}$ are synaptic connection strengths; $f_j$ and $g_i$ represent the activation functions of the neurons and the propagational signal functions, respectively; and $I_i$ and $J_j$ are the exogenous inputs. In order to establish the global exponential stability conditions for neural networks (1.1), and to make a precise comparison between our stability conditions and previous results derived in the literature, we adopt some traditional assumptions on the activation functions:

Assumption 1. There exist positive constants $l_j$ ($j = 1, \ldots, m$) and $m_i$ ($i = 1, \ldots, n$) such that $|f_j(u) - f_j(v)| \le l_j |u - v|$ and $|g_i(u) - g_i(v)| \le m_i |u - v|$ for all $u, v \in \mathbb{R}$.

Assumption 2. The activation functions are bounded; that is, $|f_j(u)| \le M_j$ and $|g_i(u)| \le N_i$ for all $u \in \mathbb{R}$, where $M_j$ and $N_i$ denote some positive constants.
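To make the model concrete, the following sketch integrates a small delayed network of the form (1.1) with the forward Euler method. The dimensions, weights, delay, and the choice of tanh for both activation families are hypothetical illustrations, not the paper's example; tanh is bounded and 1-Lipschitz, so it satisfies Assumptions 1 and 2.

```python
import numpy as np

def simulate_bam(a, b, W, V, I, J, tau, phi, psi, T=10.0, h=0.01):
    """Forward-Euler integration of a delayed BAM network of the form (1.1):
       x'_i(t) = -a_i x_i(t) + sum_j W[j, i] f(y_j(t - tau)) + I_i
       y'_j(t) = -b_j y_j(t) + sum_i V[i, j] f(x_i(t - tau)) + J_j
       with constant initial functions phi (for x) and psi (for y) on [-tau, 0]."""
    f = np.tanh                            # bounded, 1-Lipschitz activation
    d = int(round(tau / h))                # delay measured in Euler steps
    steps = int(round(T / h))
    # History buffers: rows 0..d hold the (constant) initial functions.
    X = np.tile(phi, (steps + d + 1, 1))
    Y = np.tile(psi, (steps + d + 1, 1))
    for k in range(d, d + steps):
        X[k + 1] = X[k] + h * (-a * X[k] + W.T @ f(Y[k - d]) + I)
        Y[k + 1] = Y[k] + h * (-b * Y[k] + V.T @ f(X[k - d]) + J)
    return X[d:], Y[d:]
```

With small connection weights relative to the decay rates, trajectories started from different initial functions converge to the same equilibrium, which is the behavior the stability criteria of this paper guarantee.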
Let $r = \max\{\tau, \sigma\}$. Initial conditions for (1.1) are of the form $x_i(s) = \phi_i(s)$, $y_j(s) = \psi_j(s)$, $s \in [-r, 0]$. Here, C is the continuous function space $C([-r, 0], \mathbb{R})$ with the norm $\|\phi\| = \sup_{-r \le s \le 0} |\phi(s)|$. For any initial condition $(\phi, \psi)$, systems (1.1) admit a unique solution, denoted $(x(t; \phi, \psi), y(t; \phi, \psi))$. To simplify the notation, the dependence on the initial condition will not be indicated unless necessary.
For convenience, set $B = (b_{ij})_{n \times n}$, a real $n$-dimensional matrix. $B^T$ and $B^{-1}$, respectively, represent the transpose and the inverse of the matrix B. The notation $B > 0$ ($B < 0$) means that B is symmetric and positive definite (negative definite). $\|B\|$ represents the norm of B induced by the Euclidean vector norm, i.e. $\|B\| = \sqrt{\lambda_{\max}(B^T B)}$, where $\lambda_{\max}(M)$ represents the maximum eigenvalue of matrix M. $I$ denotes the $n$-dimensional identity matrix. Sometimes we write $x(t)$ as x, $y(t)$ as y, and the transpose of $x$ as $x^T$.
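For readers checking the notation numerically, the induced norm above can be computed directly from the eigenvalues of $B^T B$ and agrees with the spectral norm; B below is an arbitrary example matrix, not one from the paper.

```python
import numpy as np

# Spot-check of the induced norm ||B|| = sqrt(lambda_max(B^T B)).
B = np.array([[1.0, 2.0],
              [0.0, 3.0]])
# eigvalsh is appropriate here because B^T B is symmetric.
induced = np.sqrt(np.max(np.linalg.eigvalsh(B.T @ B)))
# NumPy's ord=2 matrix norm is the same induced (spectral) norm.
assert np.isclose(induced, np.linalg.norm(B, 2))
```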
We will need the following definition:

Definition 1.1 (Khalil [14]). The equilibrium point $z^* = (x^*, y^*)$ is said to be globally exponentially stable if there exist positive constants $k$ and $\gamma$ such that, for any solution $z(t) = (x(t), y(t))$ of systems (1.1) with initial function $\varphi$, there holds $\|z(t) - z^*\| \le \gamma \sup_{-r \le s \le 0} \|\varphi(s) - z^*\| e^{-kt}$ for all $t \ge 0$.
The organization of this paper is as follows. In Section 2, we establish a new criterion on the global exponential stability of the equilibrium point for neural networks with delays. Some comparison will be provided in Section 3. Section 4 presents our conclusions.
Main results
Based on some facts about positive-definite matrices and integral inequalities, we present the main results in this section.
Firstly, we have the following lemma due to [27]:

Lemma 2.1. Given any real matrices $X$, $Y$, $P$ of appropriate dimensions and a scalar $\varepsilon > 0$, where $P = P^T > 0$, the following inequality holds: $X^T Y + Y^T X \le \varepsilon X^T P X + \varepsilon^{-1} Y^T P^{-1} Y$. In particular, if X and Y are vectors, $2 X^T Y \le \varepsilon X^T P X + \varepsilon^{-1} Y^T P^{-1} Y$.
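The vector form of the lemma can be spot-checked numerically; the dimensions, random matrices, and scalars below are hypothetical instances chosen only for illustration.

```python
import numpy as np

# Numerical spot-check of Lemma 2.1 for vectors:
#   2 X^T Y <= eps * X^T P X + (1/eps) * Y^T P^{-1} Y
# for any symmetric positive definite P and scalar eps > 0.
rng = np.random.default_rng(0)
n = 4
for _ in range(100):
    X = rng.standard_normal(n)
    Y = rng.standard_normal(n)
    A = rng.standard_normal((n, n))
    P = A @ A.T + n * np.eye(n)          # symmetric positive definite
    eps = rng.uniform(0.1, 10.0)
    lhs = 2 * X @ Y
    rhs = eps * X @ P @ X + (1 / eps) * Y @ np.linalg.inv(P) @ Y
    assert lhs <= rhs + 1e-9             # inequality holds in every trial
```

The inequality follows from expanding $\|\sqrt{\varepsilon}\,P^{1/2}X - \varepsilon^{-1/2}P^{-1/2}Y\|^2 \ge 0$, which is why no trial can violate it up to rounding error.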
Before we establish a criterion for the global exponential stability of systems (1.1), we give a lemma concerning the
Comparison with previous results
In this section, we compare our results with the previous results derived in the literature. Let us restate those previous stability results as follows.

Theorem 3.1 (Arik [4]). Suppose that in systems (2.9), Assumptions 1 and 2 are satisfied. The origin of neural systems (2.9) is globally asymptotically stable if the following conditions hold:

Theorem 3.2 (Chen et al. [7]). Suppose that in systems (2.9), Assumption 1 is satisfied. The origin of neural systems (2.9) is globally
Conclusion
In this paper, we obtained new results on the global exponential stability of BAM neural networks with delays. By the Lyapunov functional method and an integral inequality technique, global exponential stability criteria were derived. A comparison with previous results shows that our results establish a new set of global exponential stability criteria for BAM neural networks with constant delays. Those conditions are less restrictive than those in
Acknowledgment
The authors would like to thank the anonymous reviewers and the editor for their constructive comments.
References (27)
An analysis of exponential stability of delayed neural networks with time varying delays, Neural Networks (2004)
Global asymptotic stability analysis of bidirectional associative memory neural networks with constant time delays, Neurocomputing (2005)
Exponential stability of delayed bidirectional associative memory networks, Appl. Math. Comput. (2003)
Exponential stability of BAM neural networks with transmission delays, Neurocomputing (2004)
Analysis of global exponential stability and periodic solutions of neural networks with time-varying delays, Neural Networks (2005)
Globally exponential stability of generalized Cohen–Grossberg neural networks with delays, Phys. Lett. A (2003)
Some new results for recurrent neural networks with varying-time coefficients and delays, Phys. Lett. A (2005)
A new criterion on the global exponential stability for cellular neural networks with multiple time-varying delays, Phys. Lett. A (2005)
Delay-dependent exponential stability analysis of bi-directional associative memory neural networks with time delay: an LMI approach, Chaos Solitons Fractals (2005)
Global stability of cellular neural networks with constant and variable delays, Nonlinear Anal. (2003)
Globally exponential stability condition of a class of neural networks with time-varying delays, Phys. Lett. A
Absolute exponential stability analysis of delayed neural networks, Phys. Lett. A
Delay-dependent exponential stability of cellular neural networks with time-varying delays, Chaos Solitons Fractals
Weirui Zhao received the B.S. degree in mathematics from Huazhong Normal University, Wuhan, China, the M.S. degree and the Ph.D. degree from Fudan University, Shanghai, China, all in mathematics/applied mathematics, in 1989, 1994, and 2003, respectively. From June 1989 to September 2000, he was with Hubei Institute for Nationalities. In June 2003, he joined the Department of Mathematics, Wuhan University of Technology, Wuhan, China. Currently, he is a Postdoctoral Research Fellow in the Shenzhen Graduate School, Harbin Institute of Technology. He is the author or coauthor of 10 journal papers and conference papers. His research interests include stability theory, nonlinear systems, neural networks, and applied mathematics.
Huanshui Zhang graduated in mathematics from the Qufu Normal University in 1986 and received his M.S. and Ph.D. degrees in control theory and signal processing from the Heilongjiang University, PR China, and Northeastern University, PR China, in 1991 and 1997, respectively. He worked as a Postdoctoral Fellow at the Nanyang Technological University from 1998 to 2001 and a Research Fellow at Hong Kong Polytechnic University from 2001 to 2003. He joined Shandong Taishan College in 1986 as an Assistant Professor and became an Associate Professor in 1994. He is currently a Professor in Shenzhen Graduate School of Harbin Institute of Technology. His interests include optimal estimation, robust filtering and control, time delay systems, singular systems, wireless communication, signal processing, and neural networks.
Shulan Kong is an Associate Professor at the School of Mathematical Sciences, Qufu Normal University. She graduated in mathematics from Yantai Normal College in 1992 and received her M.S. and Ph.D. degrees in mathematics from Qufu Normal University and Shandong University in 1995 and 2004, respectively. Currently, she is a Postdoctoral Fellow at the Shenzhen Graduate School, Harbin Institute of Technology. Her research interests include nonlinear control systems, wireless communication systems, combinatorial optimization, and neural networks.