On pth moment exponential stability of stochastic Cohen–Grossberg neural networks with time-varying delays☆
Introduction
In real nervous systems there are various stochastic perturbations to the networks, and it is important to understand how these perturbations affect them. In particular, it is critical to know whether the networks remain stable under such perturbations [1]. Recently, a great number of results on stochastic neural networks with discrete delays have been reported in the literature; see e.g. [2], [3], [4], [5], [6], [7], [8], [9], [10]. pth moment exponential stability analysis is one of the most important yet challenging problems in the field of stochastic neural networks [11]. In [11], the authors applied the Razumikhin-type theorem to study the pth moment exponential stability of the following system and obtained some interesting results: where is the vector of neuron states; ; ; and are the connection weight matrix and the delayed connection weight matrix, respectively; is the vector of activation functions; ; is the diffusion coefficient matrix; and is an n-dimensional Brownian motion defined on a complete probability space with a natural filtration (i.e. ). To obtain the pth moment exponential stability of system (1), the following conditions are imposed in [11].
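For reference, the stability notion used throughout can be stated in the standard form (this is the usual textbook definition for stochastic delay systems, not a verbatim quotation from [11]):

```latex
% Standard definition of pth moment exponential stability (p >= 2).
% The trivial solution x(t) \equiv 0 is pth moment exponentially stable
% if there exist constants \lambda > 0 and C > 0 such that, for every
% initial datum \xi on [-\tau, 0],
\mathbb{E}\,\|x(t;\xi)\|^{p} \;\le\; C\,\mathbb{E}\,\|\xi\|^{p}\,e^{-\lambda t},
\qquad t \ge 0.
% When p = 2 this reduces to mean-square exponential stability.
```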
For each , there exist positive constants , such that .
For each , .
For each , there exists a positive constant , such that
For each , there are non-negative constants , , such that

The main theorem obtained in [11] reads as follows.
Theorem A (Zhu [11])
Under assumptions , if there exists a positive diagonal matrix , such that where and , then for all , the trivial solution of system (1) is pth moment exponentially stable.
However, a defect appears in the main result of [11] when : from the constructed Lyapunov function, one can see that the term is flawed. To correct this imperfection, the constructed Lyapunov function should be replaced with . Noticing that , we have ; it is then easy to see that the assumed condition is not correct and should be revised as below. On the other hand, there is an error in (3): in the coefficient of , the term should be replaced with .
For each , there exists a positive constant , such that

In this paper, we generalize system (1) and further study the following non-autonomous stochastic functional differential equations: We improve and generalize the corresponding results of [11]. We assume that the following generalized conditions are satisfied:
There exist positive functions , such that
There are non-negative functions , for , such that
Remark 1.1
Obviously, choosing , , one sees that conditions and generalize conditions and ; the significance of these generalizations is that they may provide engineers with a wider selection of parameters in practice. For simplicity, and to avoid lengthy statements, some symbols and notations with the same meanings as those in [11] are employed in the following derivations.
Main results
In this section, we circumvent the problems of Theorem A and improve it. To prove the main result, we first present two lemmas.

Lemma 2.1 (Hardy inequality, Cao and Liang [12])
Assume there exist constants ; then the following inequality holds: where and . In (6), letting yields ; letting yields .

Lemma 2.2 (Generalized Halanay inequality, Tian [13])
For two
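As a numerical sanity check (an illustration only, not part of the original proof), the scalar Halanay inequality D⁺y(t) ≤ −a y(t) + b sup over [t−τ, t] of y, with a > b > 0, forces exponential decay of y. A forward-Euler integration of the boundary case makes this visible; all parameter values below are illustrative assumptions.

```python
def halanay_decay(a=3.0, b=1.0, tau=1.0, T=20.0, dt=0.001):
    """Integrate y'(t) = -a*y(t) + b*max_{s in [t-tau, t]} y(s) by forward Euler.

    For a > b > 0, the Halanay inequality guarantees y(t) <= y(0)*exp(-lam*t)
    for some lam > 0, so the trajectory should decay exponentially to 0.
    """
    n_delay = int(round(tau / dt))
    hist = [1.0] * (n_delay + 1)   # constant initial history y(s) = 1 on [-tau, 0]
    for _ in range(int(round(T / dt))):
        y = hist[-1]
        delayed_sup = max(hist[-n_delay - 1:])   # sup of y over [t - tau, t]
        hist.append(y + dt * (-a * y + b * delayed_sup))
    return hist[-1]

print(halanay_decay())  # a small positive number, reflecting exponential decay
```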
An illustrative example
In this section, we give a numerical example with simulations to illustrate the stability criteria presented in the previous part.

Example 3.1
Consider the following stochastic recurrent neural networks with time-varying connection matrices and delays: where is a bounded positive function satisfying . In this example, let
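Since the concrete matrices of Example 3.1 are not reproduced in this excerpt, the following sketch only shows how such a criterion is typically checked numerically: an Euler–Maruyama simulation of a two-neuron stochastic delayed network with linear multiplicative noise. Every parameter value below is an illustrative assumption (chosen so the decay terms dominate), not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-neuron parameters (NOT the paper's): strong self-decay,
# weak connection weights, small noise, so the zero solution should be stable.
C = np.diag([4.0, 4.0])                  # self-feedback (decay) matrix
A = np.array([[0.3, -0.2], [0.1, 0.2]])  # connection weight matrix
B = np.array([[0.2, 0.1], [-0.1, 0.2]])  # delayed connection weight matrix
sigma = 0.2                              # noise intensity (linear diffusion)
tau, dt, T = 0.5, 0.001, 10.0
f = np.tanh                              # Lipschitz activation with f(0) = 0

n_delay = int(round(tau / dt))
steps = int(round(T / dt))
x = np.zeros((steps + n_delay + 1, 2))
x[: n_delay + 1] = [1.0, -1.0]           # constant initial history on [-tau, 0]

for k in range(n_delay, n_delay + steps):
    drift = -C @ x[k] + A @ f(x[k]) + B @ f(x[k - n_delay])
    dW = rng.normal(0.0, np.sqrt(dt), size=2)          # Brownian increment
    x[k + 1] = x[k] + drift * dt + sigma * x[k] * dW   # Euler-Maruyama step

print(np.linalg.norm(x[-1]))  # small: the trajectory contracts toward 0
```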
Conclusion
In summary, with the help of a Lyapunov function, stochastic analysis techniques, the generalized Halanay inequality and the Hardy inequality, a set of novel sufficient conditions for the pth moment exponential stability of non-autonomous stochastic Cohen–Grossberg neural networks is given, which corrects and generalizes the corresponding results in [11]. The significance of this paper is that it offers a wider selection of network parameters in order to achieve some necessary convergence in
References (13)
- et al., Mean square exponential stability of delayed Hopfield neural networks, Phys. Lett. A (2005)
- et al., Exponential stability in p-th mean of solutions, and of convergent Euler-type solutions, of stochastic delay differential equations, J. Comput. Appl. Math. (2005)
- et al., Dynamic analysis of stochastic Cohen–Grossberg neural networks with time delays, Appl. Math. Comput. (2006)
- et al., Exponential stability analysis of stochastic delayed cellular neural network, Chaos Solitons Fractals (2006)
- et al., Stability analysis of stochastic delayed cellular neural networks by LMI approach, Chaos Solitons Fractals (2006)
- et al., pth moment exponential stability of stochastic recurrent neural networks with time-varying delays, Nonlinear Anal. Real World Appl. (2007)
Chuangxia Huang received the B.S. degree in Mathematics in 1999 from National University of Defense Technology, Changsha, China. From September 2002, he began to pursue his M.S. degree in Applied Mathematics at Hunan University, Changsha, China, and from April 2004, he pursued his Ph.D. degree in Applied Mathematics in advance at Hunan University. He received the Ph.D. degree in June 2006. He is currently an Associate Professor of Changsha University of Science and Technology, Changsha, China, and a Visiting Scholar at the Department of Mathematics, Southeast University, Nanjing, China. He is the author of more than 20 journal papers. His research interests include dynamics of neural networks, and stability theory of functional differential equations.
Jinde Cao received the B.S. degree from Anhui Normal University, Wuhu, China, the M.S. degree from Yunnan University, Kunming, China, and the Ph.D. degree from Sichuan University, Chengdu, China, all in Mathematics/Applied Mathematics, in 1986, 1989, and 1998, respectively. From March 1989 to May 2000, he was with the Yunnan University. In May 2000, he joined the Department of Mathematics, Southeast University, Nanjing, China. From July 2001 to June 2002, he was a Postdoctoral Research Fellow at the Department of Automation and Computer-Aided Engineering, Chinese University of Hong Kong, Hong Kong. In 2006, 2007 and 2008, he was a Visiting Research Fellow and a Visiting Professor at the School of Information Systems, Computing and Mathematics, Brunel University, UK.
Currently, he is a TePin Professor and Doctoral Advisor at the Southeast University, prior to which he was a Professor at Yunnan University from 1996 to 2000. He is the author or coauthor of more than 150 journal papers and five edited books. His research interests include nonlinear systems, neural networks, complex systems and complex networks, stability theory, and applied mathematics.
Dr. Cao is a senior member of IEEE, a Reviewer of Mathematical Reviews and Zentralblatt-Math, and an Associate Editor of the IEEE Transactions on Neural Networks, the Journal of the Franklin Institute, Mathematics and Computers in Simulation, and Neurocomputing.
☆ This work was jointly supported by the China Scholarship Foundation under Grant no. 2008106456, the Foundation of the Chinese Society for Electrical Engineering, and the Hunan Provincial Natural Science Foundation of China under Grant no. 07JJ4001.