Elsevier

Neurocomputing

Volume 266, 29 November 2017, Pages 409-419

Improved criteria of delay-dependent stability for discrete-time neural networks with leakage delay

https://doi.org/10.1016/j.neucom.2017.05.053

Abstract

This paper is concerned with the stability analysis of discrete-time neural networks with leakage and time-varying delays. By a novel summation inequality, the technique of reciprocally convex combination and triple Lyapunov–Krasovskii terms, the various cases of time-delay are discussed in detail and improved criteria are established to ensure the delay-dependent stability of discrete-time neural networks. Finally, three examples are given to verify the effectiveness of the proposed methods.

Introduction

Over the past few decades, many researchers have paid much attention to neural networks because of their wide applications in image processing, signal processing, fault diagnosis, pattern recognition, combinatorial optimization, associative memory and so on. A series of dynamic behaviors, such as stability, instability, periodic oscillation and chaos, have a great influence on neural networks. In view of this, many important results have been reported in the literature; see [3], [10], [21], [25], [26], [28], [33], [35] and the references therein.

It is worth mentioning that most studies of neural networks are concerned with continuous-time systems; however, discrete-time neural networks are more important than their continuous-time counterparts in practical applications such as bidirectional associative memory, nonlinear output regulation and adaptive tracking [2], [11], [17], [18], [23]. Moreover, simulation or computation is essential for continuous-time neural networks, so it is necessary to discretize continuous-time neural networks to formulate discrete-time systems. Therefore, the stability of discrete-time neural networks has gradually attracted the attention of researchers. Both analysis and synthesis problems of discrete-time neural networks have been reported in a large number of articles [1], [6], [11], [12], [14], [15], [22], [24], [31]. In [4], Kwon et al. studied discrete-time neural networks with time-varying delays, where several sufficient conditions ensuring delay-dependent stability were constructed via a newly augmented Lyapunov–Krasovskii functional. Utilizing augmented Lyapunov–Krasovskii functionals and considering some zero equalities with a new delay-partitioning law, the results in [4] were improved in [7]. The results were further improved by Shu et al. in [8] by employing a novel summation inequality. In [9], Jin et al. proposed an improved summation inequality, together with a Wirtinger-based inequality and fewer decision variables than [8]. A number of results have also been reported on the stability of Markovian jump neural networks with delays in [27], [30], [31], [34].

For the convenience of analysis, the leakage delay was often neglected when modeling most systems in the past. For example, in [29], the authors used a novel summation inequality to study the stability of neural networks with time-varying delays and gave conditions in terms of LMIs under which the system is globally asymptotically stable. By exploiting all possible information in the mixed time delays, a sufficient condition for the whole system to be mean-square exponentially stable was given in [13]. The work in [19] investigated the exponential synchronization of discrete-time neural networks with mixed time delays, actuator saturation and failures. The stability of uncertain neural networks was studied in [2], [5], [17]. However, none of these works took the leakage delay into account, although it is inevitable and an inherent characteristic of such systems. Of course, several articles have taken the leakage delay into account. For example, to construct Lyapunov functionals with triple summation terms, the double sum was handled by the reciprocally convex combination technique, and the robust stability of a class of discrete-time neural networks with leakage delay and parameter uncertainties was studied in [16]. In addition, in [6], [20], [22], the authors also studied discrete-time systems with leakage delay. Nevertheless, although [16] considered the leakage delay, the complexity of Theorem 3.1 therein was increased by the introduction of the free-weighting matrix $\Pi=[\Pi_1\;\Pi_2]$. By contrast, in this paper, we use an updated inequality method to obtain less conservative stability conditions with fewer decision variables. What's more, in terms of the size of the time-varying delay interval, especially in the case of a short delay interval, we give a more detailed classification as required by the derivation process.

Based on the above discussion, the research work in this paper is motivated by the problem of global asymptotic stability analysis of discrete-time neural networks with leakage delay, time-varying delays and randomly occurring parameter uncertainties. Relaxed asymptotic stability criteria are presented in terms of LMIs. The advantage of this paper lies in that it is the first attempt to use a new summation inequality method (i.e., Corollary 2.1) to deal with the terms $\tau_1\sum_{i=k-\tau_M}^{k-\tau_m-1}\eta^T(i)Z_2\eta(i)$ and $\sigma\sum_{i=k-\sigma}^{k-1}\eta^T(i)Z_3\eta(i)$ for discrete-time neural networks with leakage delay and time-varying delays. Secondly, less conservative conditions and fewer decision variables result from constructing appropriate Lyapunov–Krasovskii functionals, including triple summation terms, and from utilizing the reciprocally convex inequality. What's more, in the two new activation function conditions $F_i^-\le\frac{\hat f_i(s_1)-\hat f_i(s_2)}{s_1-s_2}\le F_i^+$ and $G_i^-\le\frac{\hat g_i(s_1)-\hat g_i(s_2)}{s_1-s_2}\le G_i^+$, the bounds $F_i^-$, $F_i^+$, $G_i^-$ and $G_i^+$ are assumed to be constants, and they are not restricted to be positive, negative or zero. Thirdly, according to the different values of $\tau(k)$ and the size of the delay interval, especially in the case of a short delay interval (i.e., $\tau_1=\tau_M-\tau_m=1$ or $\tau_1=\tau_M-\tau_m=2$), this paper gives a more detailed classification of time-varying delays for the first time. For convenience, we first introduce a new set of indicators $\alpha_0\in\{0,1\}$, $\alpha_1\in\{0,1\}$, $\alpha_2\in\{0,1\}$, $\beta_0\in\{0,1\}$ and $\beta_1\in\{0,1\}$ to represent the different situations in our results. Such a detailed discussion is not found in the existing literature; however, it is necessary for discrete systems. Finally, three numerical examples are given to demonstrate the feasibility of the results.
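As a quick sanity check on the kind of summation bound discussed above, the basic discrete Jensen inequality $\tau\sum_{i=1}^{\tau}\eta^T(i)Z\eta(i)\ge\big(\sum_{i=1}^{\tau}\eta(i)\big)^T Z\big(\sum_{i=1}^{\tau}\eta(i)\big)$ for $Z\succeq 0$ can be verified numerically. The sketch below uses random data and is only illustrative of this classical bound, not of the refined Corollary 2.1 used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, tau = 3, 7
# Random positive semidefinite weight matrix Z = Q^T Q.
Q = rng.standard_normal((n, n))
Z = Q.T @ Q
# Random sequence eta(i), i = 1..tau (rows of a tau x n array).
eta = rng.standard_normal((tau, n))

lhs = tau * sum(x @ Z @ x for x in eta)   # tau * sum_i eta(i)^T Z eta(i)
s = eta.sum(axis=0)                       # sum_i eta(i)
rhs = s @ Z @ s                           # (sum eta)^T Z (sum eta)

# The discrete Jensen (summation) inequality guarantees lhs >= rhs.
assert lhs >= rhs - 1e-9
```

Tighter summation inequalities of the type used in the paper sharpen exactly this gap between `lhs` and `rhs` by adding correction terms built from weighted sums of the sequence.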

Notation 1.1

Throughout this paper, $\Re^n$ denotes the n-dimensional Euclidean space and $\Re^{n\times n}$ is the set of all n × n real matrices. For symmetric matrices X and Y, the notation X > Y (X ≥ Y) means that the matrix X − Y is positive definite (positive semidefinite). The superscripts T and (−1) denote matrix transposition and matrix inversion, respectively. sym(A) = A + A^T. I is an identity matrix with appropriate dimensions. The notation * always denotes the symmetric block in a symmetric matrix. Unless otherwise stated, matrices are assumed to have appropriate dimensions.

Section snippets

Preliminaries

Consider the following neural network with leakage delay:
$$\begin{cases} y(k+1)=Ay(k-\sigma)+B\hat f(y(k))+C\hat g(y(k-\tau(k)))+J, & k\ge 0,\\ y(k)=\varphi(k), & k=-\rho,-\rho+1,\ldots,0,\end{cases}$$
where $y(k)=[y_1(k),y_2(k),\ldots,y_n(k)]^T\in\Re^n$ is the neural state vector, $\hat f(y(\cdot))=[\hat f_1(y_1(\cdot)),\hat f_2(y_2(\cdot)),\ldots,\hat f_n(y_n(\cdot))]^T\in\Re^n$ and $\hat g(y(\cdot))=[\hat g_1(y_1(\cdot)),\hat g_2(y_2(\cdot)),\ldots,\hat g_n(y_n(\cdot))]^T\in\Re^n$ denote the neuron activation functions, and $J=(J_1,J_2,\ldots,J_n)^T\in\Re^n$ is a constant input vector. $\sigma$ represents the leakage delay satisfying $\sigma\ge 1$, and $A=\mathrm{diag}\{a_1,a_2,\ldots,a_n\}\in\Re^{n\times n}$ with $|a_i|<1$ is the leakage delay
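The model above can be iterated directly. The following minimal Python sketch simulates the delayed recursion $y(k+1)=Ay(k-\sigma)+B\hat f(y(k))+C\hat g(y(k-\tau(k)))+J$; the matrices, delay pattern, initial history and tanh activations are illustrative assumptions for demonstration, not the paper's parameters.

```python
import numpy as np

def simulate(A, B, C, J, sigma, tau, phi, steps, f=np.tanh, g=np.tanh):
    """Iterate y(k+1) = A y(k-sigma) + B f(y(k)) + C g(y(k-tau(k))) + J.

    phi is the initial history phi(-rho), ..., phi(0);
    tau is a list giving the (periodic) time-varying delay pattern.
    """
    hist = list(phi)
    for k in range(steps):
        y_k     = hist[-1]
        y_sigma = hist[-1 - sigma]
        y_tau   = hist[-1 - tau[k % len(tau)]]
        hist.append(A @ y_sigma + B @ f(y_k) + C @ g(y_tau) + J)
    return np.array(hist)

# Illustrative parameters (assumptions): |a_i| < 1 as the model requires,
# small coupling matrices, zero input, leakage delay 2, delay in {1, 2}.
A = np.diag([0.5, 0.4])
B = np.array([[0.10, 0.00], [0.05, 0.10]])
C = np.array([[0.10, 0.02], [0.00, 0.10]])
J = np.zeros(2)
sigma, tau = 2, [1, 2, 1]
phi = [np.array([1.0, -1.0])] * (max(sigma, max(tau)) + 1)

traj = simulate(A, B, C, J, sigma, tau, phi, steps=100)
```

With these contractive illustrative matrices the state decays toward the origin, which is the qualitative behavior the stability criteria certify.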

Main results

In this section, by using Lyapunov–Krasovskii functionals, new globally asymptotic stability criteria for system (2) will be proposed. For convenience of presentation, the following notations are introduced.

When dealing with the third term of ΔV₄(k) and the latter two terms of ΔV₆(k), we analyze the different cases for τ₁ and τ(k) to meet the conditions of Corollary 2.1 and its subsequent further inequality. We introduce a novel set of indicators (i.e., α₀, α₁, α₂, β₀ and β₁) to solve the problem. α₀ = {1, when τ₁

Examples

The following are three simulation examples illustrating Theorem 3.2, system (39) and Corollary 3.1.

Example 1

Consider an uncertain discrete-time neural network as follows:
$$\begin{cases} x(k+1)=Ax(k-\sigma)+Bf(x(k))+Cg(x(k-\tau(k)))+Mq(k),\\ q(k)=F(k)\varsigma(k),\\ \varsigma(k)=\alpha(k)E_a x(k-\sigma)+\beta(k)E_b f(x(k))+\gamma(k)E_c g(x(k-\tau(k))),\\ x(k)=\phi(k),\quad k=-\rho,-\rho+1,\ldots,0.\end{cases}$$
The parameters of the above neural network are as follows:
$$A=\begin{bmatrix}0.9 & 0\\ 0 & 0.8\end{bmatrix},\quad B=\begin{bmatrix}1.4 & 0\\ 1.9 & 0.5\end{bmatrix},\quad C=\begin{bmatrix}1.1 & 0.4\\ 0.1 & 0.08\end{bmatrix},$$
$$E_a=\begin{bmatrix}0.06 & 0.02\\ 0 & 0.02\end{bmatrix},\quad E_b=\begin{bmatrix}0.08 & 0\\ 0.02 & 0.04\end{bmatrix},\quad E_c=\begin{bmatrix}0.01 & 0.03\\ 0.02 & 0.02\end{bmatrix},$$
$$M=\begin{bmatrix}0.05 & 0\\ 0 & 0.012\end{bmatrix},\quad F(k)=\begin{bmatrix}\sin(k) & 0\\ 0 & \cos(k)\end{bmatrix},$$
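To illustrate the dynamics, the sketch below simulates the Example 1 system in Python. The row-major reading of the matrices (flattened in the extraction), the tanh activations, the delay bounds, the initial history and the Bernoulli distributions of α(k), β(k), γ(k) are all assumptions, since the snippet does not specify them.

```python
import numpy as np

rng = np.random.default_rng(1)

# Parameter matrices of Example 1, read row-major (an assumption about layout).
A  = np.array([[0.9, 0.0], [0.0, 0.8]])
B  = np.array([[1.4, 0.0], [1.9, 0.5]])
C  = np.array([[1.1, 0.4], [0.1, 0.08]])
Ea = np.array([[0.06, 0.02], [0.0, 0.02]])
Eb = np.array([[0.08, 0.0], [0.02, 0.04]])
Ec = np.array([[0.01, 0.03], [0.02, 0.02]])
M  = np.array([[0.05, 0.0], [0.0, 0.012]])

f = g = np.tanh                  # assumed activations (bounded, sector-type)
sigma, tau_m, tau_M = 2, 1, 3    # assumed leakage delay and delay bounds

hist = [np.array([0.5, -0.5])] * (max(sigma, tau_M) + 1)
for k in range(60):
    F = np.diag([np.sin(k), np.cos(k)])              # F(k) as given
    tau_k = tau_m + (k % (tau_M - tau_m + 1))        # delay sweeping [tau_m, tau_M]
    a, b, c = rng.integers(0, 2, size=3)             # Bernoulli alpha, beta, gamma
    x, x_s, x_t = hist[-1], hist[-1 - sigma], hist[-1 - tau_k]
    zeta = a * Ea @ x_s + b * Eb @ f(x) + c * Ec @ g(x_t)
    q = F @ zeta                                     # q(k) = F(k) * varsigma(k)
    hist.append(A @ x_s + B @ f(x) + C @ g(x_t) + M @ q)

traj = np.array(hist)
```

Because the assumed activations are bounded by 1 and |a_i| < 1, the trajectory stays bounded regardless of the random uncertainty switches; the LMI criteria in the paper go further and certify asymptotic stability.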

Conclusions

In this paper, improved global asymptotic stability conditions for discrete-time neural networks with leakage delay and randomly occurring uncertainties are proposed. In addition, the different situations of the delay are discussed in detail. By the generalized discrete Jensen inequality, the technique of reciprocally convex combination and the construction of triple Lyapunov–Krasovskii terms, improved criteria of delay-dependent stability are derived. Finally, numerical examples have been given to verify the effectiveness of the proposed criteria.


References (35)


Yaonan Shan was born in Anyang, China. She received the B.S. degree from Zhongyuan University of Technology, Zhengzhou, China, in 2015. She is currently working towards the master's degree in the School of Mathematical Sciences at the University of Electronic Science and Technology of China. Her current research interests include stability theory, robust stability, time-delay systems and neural networks.

Shouming Zhong was born on November 5, 1955. He graduated from the University of Electronic Science and Technology of China, majoring in applied mathematics with a focus on differential equations. He has been a professor in the School of Mathematical Sciences, University of Electronic Science and Technology of China, since June 1997. He is a Director of the Chinese Mathematical Biology Society, the chair of Biomathematics in Sichuan, and an editor of the Journal of Biomathematics. He has reviewed for many journals, such as the Journal of Theory and Application on Control, Journal of Automation, Journal of Electronics, and Journal of Electronics Science. His research interests are stability theory and its applications to differential systems, robust control, neural networks and biomathematics.

Jinzhong Cui is a professor whose main fields are control theory and industrial computer control. He is with the School of Computer Science and Engineering, University of Electronic Science and Technology of China.

Liyuan Hou was born in September 1986. She received the B.S. degree from Leshan Normal University and the Ph.D. degree from the University of Electronic Science and Technology of China. She is currently a lecturer at the College of Mathematics and Information Science, Leshan Normal College. Her research focuses on applied mathematics, robust control, neural networks and information science.

Yuanyuan Li was born in Anhui, China. She received the B.S. degree from Fuyang Normal University, Fuyang, China, in 2013, and the master's degree from the School of Mathematical Sciences at the University of Electronic Science and Technology of China in 2016. She is currently working towards the Ph.D. degree in the School of Automation Engineering at the University of Electronic Science and Technology of China. Her current research interests include time-delay systems and adaptive control.

This work was financially supported by the Natural Science Foundation of China (No. 61533006) and Scientific Research Fund of Sichuan Provincial Department of Education, 17ZB0194.
