Elsevier

Neurocomputing

Volume 179, 29 February 2016, Pages 126-134
Robust stability analysis for discrete-time neural networks with time-varying leakage delays and random parameter uncertainties

https://doi.org/10.1016/j.neucom.2015.11.069

Highlights

  • The robust stability problem for discrete-time neural networks with leakage delays is investigated.

  • The random parameter uncertainties are characterized by a Bernoulli distribution.

  • Based on the LKF method, sufficient conditions for stability are given in terms of LMIs.

  • Numerical simulations are exploited to illustrate the theoretical results.

Abstract

This paper is concerned with the problem of robust stability analysis for discrete-time neural networks with time-varying coupling delays, random parameter uncertainties and time-varying leakage delays. The uncertainties enter into the system parameters in a random way, and such randomly occurring uncertainties obey certain mutually uncorrelated Bernoulli-distributed white noise sequences. An important feature of the results reported here is that the probability of occurrence of the parameter uncertainties is assumed to be known a priori. By constructing suitable Lyapunov–Krasovskii functional (LKF) terms, sufficient conditions ensuring the stability of the discrete-time neural networks are derived in terms of linear matrix inequalities (LMIs). Finally, numerical examples are presented to illustrate the effectiveness of the proposed results.

Introduction

Over the past decades, neural networks have received considerable research attention and have been successfully applied in different areas such as image processing, signal processing, fault diagnosis, pattern recognition, and so on. It is well known that, both in biological and artificial neural networks, the interactions between neurons are generally asynchronous, which inevitably results in time-delays that are often a source of undesirable complex dynamical behaviours such as instability, oscillation, chaos and poor performance [12], [19], [23]. In electronic implementations of neural networks, the delays are usually time-varying due to the finite switching speed of amplifiers. In a neural network model, data or axon signal transmission is always accompanied by a non-zero interval delay between the initial and delivery time of messages or signals. An interval time-varying delay is a time-delay that varies in an interval whose lower bound is not restricted to be 0. A typical example of dynamical systems with interval time-varying delay is networked control systems [3]. Further, the time-delay in the leakage term has a great impact on the dynamical behaviour of neural networks. It is known from the literature on population dynamics [2] that time-delays in the stabilizing negative feedback terms have a tendency to destabilize a system. Since time-delays in the leakage terms are usually not easy to handle, such delays have rarely been considered in the neural network literature so far. In practice, the leakage time-delay is not constant, so a model with a time-varying leakage delay ought to be considered. Thus, the stability analysis of neural networks with a time-varying delay in the leakage term is of primary significance in neural network research.

Further, it is well known that in practical situations, uncertainties have a great impact on the performance of neural networks. In neural networks, the connection weights of the neurons depend on certain resistance and capacitance values that include modeling errors or uncertainties [9]. Deviations and perturbations in parameters are the main sources of uncertainty. The problem of exponential stability has been studied for a class of discrete-time stochastic neural networks with time-varying delay and norm-bounded uncertainties [11]. Network-induced phenomena can lead to abrupt structural and parametric changes in practical engineering applications. The parameter uncertainties may be subject to random changes in environmental circumstances, for instance, network-induced random failures and repairs of components, changing subsystem interconnections, sudden environmental disturbances, etc. These parameter uncertainties may occur in a probabilistic way with certain types and intensities. Hence, it is significant to consider random parameter uncertainties when designing networked systems; see, for example, [6], [8].
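The "randomly occurring uncertainty" mechanism described above can be sketched in a few lines: a Bernoulli white-noise sequence with known occurrence probability p gates whether a perturbation ΔA is active at step k. The matrices A and dA below are illustrative placeholders, not parameters from the paper, and the norm-bounded structure of the perturbation is omitted; only the Bernoulli gating is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 0.5                                      # known occurrence probability
A = np.array([[0.9, 0.0], [0.0, 0.8]])       # hypothetical nominal matrix
dA = np.array([[0.05, 0.0], [0.0, 0.02]])    # hypothetical perturbation

def effective_matrices(n_steps):
    """Return the sequence A + alpha(k)*dA for a Bernoulli gate alpha(k)."""
    alpha = rng.random(n_steps) < p          # Bernoulli(p) white-noise sequence
    return [A + a * dA for a in alpha], alpha

mats, alpha = effective_matrices(1000)
```

Over a long run, the empirical frequency of the uncertainty being active approaches the a priori probability p, which is exactly the information the stability conditions exploit.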

Recently, the stability analysis problem for stochastic neural networks has been investigated, and less conservative results have been derived using the probability distribution of the time-varying delay [5]. The state estimation problem for neural networks with Markovian jumps has been studied in [18], [22]. The stability of stochastic neural networks with a constant time-delay in the leakage term has been investigated in [14]. Although the stability analysis of neural networks has received much attention, so far very few results have been reported on the stability analysis of discrete-time neural networks with time-delays in the leakage (or "forgetting") term. Stochastic disturbances are mostly inevitable owing to noise in electronic implementations, and certain stochastic inputs can make a neural network unstable. The stability of stochastic discrete-time neural networks with discrete time-varying delays and leakage delay has been well investigated in [4], [15]. The authors in [14] have presented a robust analysis approach to the stochastic stability of uncertain Markovian jumping discrete-time neural networks with a time-delay in the leakage term. A delay-dependent robust synchronization analysis has been carried out for coupled stochastic discrete-time neural networks with interval time-varying delays in the network coupling, a time-delay in the leakage term and parameter uncertainties [17]. The robust stability problem for discrete-time uncertain neural networks with time-varying leakage delays has been considered in [7]. Unfortunately, to the best of the authors' knowledge, the stability analysis of discrete-time neural networks with time-varying network coupling delay, time-varying leakage delay and random parameter uncertainties has not been investigated yet. The proposed neural network model is thus closer to practical networks.

Motivated by the above, in this paper we aim to establish robust stability conditions for a class of discrete-time neural network systems with time-varying leakage delay and randomly occurring uncertainties. The Lyapunov–Krasovskii functional (LKF) is chosen to be of augmented type, which utilizes more information about the system. It is worth pointing out that in this work the LKF is developed with more decision variables to exploit the information of both the lower and upper bounds of the time-varying transmission and leakage delays. A new set of zero equations is added to the forward difference of the LKF and, by employing the reciprocal convex lemma on the augmented terms, sufficient stability conditions are established in terms of LMIs. The derived stability conditions depend on the lower and upper bounds of the transmission delay as well as the leakage delay. The feasibility of the derived criteria can be easily checked by resorting to the Matlab LMI Toolbox. Finally, numerical examples are included to illustrate the effectiveness of the proposed results.
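As a rough illustration of the augmented-LKF construction described above (the weighting matrices P, Q_1, Q_2, R_1 here are generic placeholders, not the paper's exact choice, and the leakage-delay terms are omitted), a typical delay-bound-dependent candidate has the form:

```latex
V(k) = y^{T}(k) P\, y(k)
     + \sum_{i=k-\tau(k)}^{k-1} y^{T}(i) Q_{1}\, y(i)
     + \sum_{i=k-\tau_{M}}^{k-\tau_{m}-1} y^{T}(i) Q_{2}\, y(i)
     + \sum_{j=-\tau_{M}}^{-\tau_{m}-1} \; \sum_{i=k+j}^{k-1} \eta^{T}(i) R_{1}\, \eta(i),
\qquad \eta(i) = y(i+1) - y(i).
```

The second and third sums are what tie the conditions to both the lower bound τ_m and the upper bound τ_M of the transmission delay; the paper's actual functional carries additional augmented terms for the leakage delay σ(k).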

This paper is organized as follows. Problem formulation and preliminaries are given in Section 2. Section 3 gives the sufficient stability conditions for the discrete-time neural networks system. Numerical examples are demonstrated in Section 4 to illustrate the effectiveness of the proposed method. Finally, conclusions are drawn in Section 5.

Notations: Throughout this paper, R^n and R^(n×n) denote the n-dimensional Euclidean space and the set of all n×n real matrices, respectively. The superscripts T and (−1) denote matrix transposition and matrix inverse, respectively. Matrices, if their dimensions are not explicitly stated, are assumed to have compatible dimensions. ‖·‖ is the Euclidean norm in R^n. I is an identity matrix with appropriate dimension. E denotes the mathematical expectation.


Problem description and preliminaries

Consider the following discrete-time delayed neural network system with time-varying leakage delay:

y(k+1) = A y(k − σ(k)) + B f̂(y(k)) + C ĝ(y(k − τ(k))) + J,

where y(·) = [y_1(·), …, y_n(·)]^T ∈ R^n is the state vector, f̂(·) = [f̂_1(·), …, f̂_n(·)]^T ∈ R^n and ĝ(·) = [ĝ_1(·), …, ĝ_n(·)]^T ∈ R^n denote the activation functions, and J = [J_1, …, J_n]^T is the external input vector. σ(k) represents the leakage delay satisfying 0 < σ_m ≤ σ(k) ≤ σ_M, where σ_m and σ_M denote the lower and upper bounds of σ(k). τ(k) describes the transmission delay satisfying 0 < τ_m ≤ τ
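To make the recursion concrete, the model can be simulated directly. Everything below is a hypothetical instance for illustration: the parameter matrices, the delay bounds, the constant initial history, and the use of tanh in place of the activation functions f̂ and ĝ are all assumptions, not values from the paper.

```python
import numpy as np

# Illustrative parameters (NOT from the paper): a clearly contractive system.
A = np.array([[0.5, 0.0], [0.0, 0.4]])
B = np.array([[0.1, 0.0], [0.0, 0.1]])
C = np.array([[0.1, 0.0], [0.0, 0.1]])
J = np.array([0.0, 0.0])
f = g = np.tanh                          # stand-in activation functions

sigma_m, sigma_M = 1, 3                  # hypothetical leakage-delay bounds
tau_m, tau_M = 1, 4                      # hypothetical transmission-delay bounds
rng = np.random.default_rng(1)

d_max = max(sigma_M, tau_M)
hist = [np.array([0.5, -0.3])] * (d_max + 1)  # constant initial history

for k in range(200):
    sigma_k = rng.integers(sigma_m, sigma_M + 1)  # time-varying leakage delay
    tau_k = rng.integers(tau_m, tau_M + 1)        # time-varying transmission delay
    y_next = (A @ hist[-1 - sigma_k] + B @ f(hist[-1])
              + C @ g(hist[-1 - tau_k]) + J)
    hist.append(y_next)
```

With these small gains the state decays toward the origin regardless of how the delays vary inside their bounds, which is the qualitative behaviour the stability conditions certify.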

Main results

In this section, we aim to provide delay-dependent sufficient conditions which ensure the asymptotic stability of the discrete-time neural network system (2) with leakage delay.

Theorem 3.1

Under Assumption 1, the neural network system (2) is globally asymptotically stable if there exist symmetric matrices P_i > 0 (i = 1, …, 8), Q_1 > 0, Q_2 > 0, R_1 > 0, R_2 > 0, S_j > 0, U_j > 0, diagonal matrices Λ_1 > 0, Λ_2 > 0, and matrices T_j (j = 1, …, 4), W_1, W_2 of appropriate dimensions, such that the following LMIs hold:

[S_1 S_2; S_3 S_4] > 0,   [U_1 U_2; U_3 U_4] > 0,   Φ =
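Once candidate matrices are produced by a solver, the block positive-definiteness conditions of the theorem can be verified numerically by an eigenvalue check. The matrices S1, S2, S4 below are hypothetical candidates, not values from the paper (the block is assembled with S3 = S2^T so that it is symmetric); an actual feasibility search would use an SDP solver such as the MATLAB LMI Toolbox mentioned in the text.

```python
import numpy as np

# Hypothetical candidate blocks (for illustration only).
S1 = np.array([[2.0, 0.3], [0.3, 1.5]])
S2 = np.array([[0.1, 0.0], [0.0, 0.1]])
S4 = np.array([[1.8, 0.2], [0.2, 2.1]])

# Assemble the symmetric block matrix [S1 S2; S2^T S4].
S = np.block([[S1, S2], [S2.T, S4]])

def is_positive_definite(M):
    """A symmetric matrix is positive definite iff all eigenvalues are > 0."""
    return bool(np.all(np.linalg.eigvalsh(M) > 0))
```

Checking eigenvalues of the assembled block, rather than of S1 and S4 separately, matters: the off-diagonal coupling S2 can destroy positive definiteness even when both diagonal blocks are positive definite.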

Numerical examples

In this section, numerical examples will be given to substantiate the main results.

Example 4.1

Consider an uncertain discrete-time neural network system with the following parameters:

A = [0.9 0; 0 0.8],   B = [1.4 0; 1.9 0.5],   C = [1.1 0.4; 0.1 0.08],
Ga = [0.06 0.02; 0 0.02],   Gb = [0.08 0; 0.02 0.04],   Gc = [0.01 0.03; 0.02 0.02],
E = [0.05 0; 0 0.012],   F(k) = [sin(k) 0; 0 cos(k)].

Assume α = 0.5, β = 0.6, γ = 0.3. The nonlinear functions are given as

f(x(k)) = [tanh(0.2 x_1(k)) + 0.2; tanh(0.2 x_2(k)) + 0.1],
g(x(k)) = [tanh(0.2 x_1(k)); tanh(0.2 x_2(k))].

The activation functions
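A trajectory of this example can be sketched as follows. The delay sequences, the initial history, and the simplified perturbation (a Bernoulli gate on Ga alone, rather than the full norm-bounded form E F(k) Ga) are illustrative assumptions; the run only checks boundedness of a finite trajectory, not the asymptotic stability proved in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Parameters of Example 4.1 (as reconstructed above).
A = np.array([[0.9, 0.0], [0.0, 0.8]])
B = np.array([[1.4, 0.0], [1.9, 0.5]])
C = np.array([[1.1, 0.4], [0.1, 0.08]])
Ga = np.array([[0.06, 0.02], [0.0, 0.02]])
alpha_p = 0.5                            # occurrence probability alpha

f = lambda x: np.array([np.tanh(0.2 * x[0]) + 0.2, np.tanh(0.2 * x[1]) + 0.1])
g = lambda x: np.tanh(0.2 * x)

d_max = 4
hist = [np.array([1.0, -1.0])] * (d_max + 1)  # hypothetical initial history
for k in range(300):
    sigma_k = rng.integers(1, 4)         # hypothetical leakage delay in [1, 3]
    tau_k = rng.integers(1, d_max + 1)   # hypothetical transmission delay in [1, 4]
    gate = rng.random() < alpha_p        # Bernoulli uncertainty switch
    Ak = A + gate * Ga                   # simplified randomly perturbed matrix
    y_next = (Ak @ hist[-1 - sigma_k] + B @ f(hist[-1])
              + C @ g(hist[-1 - tau_k]))
    hist.append(y_next)

peak = max(float(np.linalg.norm(y)) for y in hist)
```

Because the activation functions are bounded and the perturbed leakage matrix keeps spectral radius below one, the trajectory stays bounded for any admissible realization of the delays and the Bernoulli switch.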

Conclusions

In this paper, we have investigated the robust stability problem for a discrete-time neural network system with time-varying leakage delay and randomly occurring uncertainties. Based on the Lyapunov functional approach, sufficient delay-dependent conditions have been established to ensure the robust stability of the considered neural network system in terms of LMIs, which can be effectively checked by the powerful LMI Toolbox in MATLAB. Numerical simulations have been exploited to illustrate the theoretical results.

Acknowledgments

This research work of L. Jarina Banu is supported by University Grants Commission – Maulana Azad National Fellowship (UGC-MANF), New Delhi, India under the grant no. F1-17.1/2011/MANF-MUS-TAM-6592/ (SA-III/Website)/ dt. 02/01/2012. The authors are grateful to the anonymous reviewers for their insightful comments and constructive suggestions to improve the quality of the manuscript.

References (24)

Cited by (34)

  • Reliable filter design for discrete-time neural networks with Markovian jumping parameters and time-varying delay

    2020, Journal of the Franklin Institute
Citation Excerpt:

It should be mentioned that the stability criterion developed in [21] is less conservative than those in [14,15]. Nevertheless, it should be noted that the results obtained in [14,15,21] are sufficient conditions, and thus they could be further improved. The main difficulty is how to deal with the time-varying delay term in order to obtain less conservative stability criteria than the existing results.

  • A Granular Functional Network with delay: Some dynamical properties and application to the sign prediction in social networks

    2018, Neurocomputing
Citation Excerpt:

(13) can be regarded as a special case of the one considered in [19], with no leakage delay, the identity function to activate u(i), and no additional external terms. Since the functions Ai are continuous and bounded in the considered domain, Theorem 3.1 in [19] can be adapted to state the global asymptotic stability of the proposed network model. According to Assumption 1, we can consider the network as an ensemble of neuron pairs.

L. Jarina Banu graduated and post graduated in Mathematics from Gandhigram Rural Institute-Deemed University, Tamilnadu, India, in 2008 and 2011, respectively. Currently under Maulana Azad National Fellowship of University Grants Commission, Government of India, New Delhi, she is pursuing Ph.D degree under the supervision of Prof. P. Balasubramaniam in the Department of Mathematics, Gandhigram Rural Institute-Deemed University, India. To her credit, she has published 9 research articles in SCI journals. Her research interest is in the area of stability analysis of discrete dynamical systems including switched systems, fuzzy systems and complex networks.

P. Balasubramaniam post graduated in the year 1989 and subsequently completed Master of Philosophy in the year 1990 and Doctor of Philosophy (Ph.D.) in 1994 in the field of Mathematics with specialized area of Control Theory from Bharathiar University, Coimbatore, Tamilnadu, India. Soon after his completion of Ph.D. degree, he served as Lecturer in Mathematics in Engineering Colleges for three years. Since February 1997 he served as Lecturer and Reader in Mathematics and now he is rendering his services as a Professor, Department of Mathematics, Gandhigram Rural University, Gandhigram, India, from November 2006 onwards. He has worked as a Visiting Research Professor during the years 2001 and 2005–2006 for promoting research in the field of control theory and neural networks at Pusan National University, Pusan, South Korea. Also he has worked as Visiting Professor in the Institute of Mathematical Sciences, University of Malaya, Malaysia, for the period of six months from September 2011 to March 2012. He is a member of several academic bodies including a life member of Cryptology Research Society of India, Indian Statistical Institute, Kolkata. He has 23 years of experience in teaching and research. He has published more than 211 research papers in various SCI journals holding impact factors with Scopus H-index 29 and web of knowledge H-Index 23. Also he has edited 7 proceedings including a book and 3 international conference proceedings in Springer publications. He is serving as a reviewer of many SCI journals and member of the editorial board of Journal of Computer Science, Advances in Fuzzy Sets and Systems and the Scientific World Journal: Mathematical Analysis, Hindawi Publisling Corporation, USA. He is an Editor-in-Chief of the journal Modern Instrumentation, Scientific Research Publishing Inc. (SCIRP) and Associate Editor of Advances in Difference Equations, Springer, Germany. 
He has received the Tamilnadu Scientist Award (TANSA) for the discipline of Mathematical Sciences instituted by the Tamilnadu State Council for Science and Technology in the year 2005. His research interest includes the areas of Control theory, Stochastic differential equations, Soft Computing, Stability analysis, Cryptography, Neural Networks and image processing.
