Elsevier

Neurocomputing

Volume 129, 10 April 2014, Pages 476-481

Mean square exponential stability for discrete-time stochastic switched static neural networks with randomly occurring nonlinearities and stochastic delay

https://doi.org/10.1016/j.neucom.2013.09.011

Abstract

A class of discrete-time stochastic switched static neural networks is presented in which randomly occurring nonlinearities and stochastic delay are introduced. The mean square exponential stability of these networks is investigated. Using a linear matrix inequality (LMI) approach, a delay-dependent stability criterion is derived via a Lyapunov–Krasovskii functional. An example with simulation results is given to illustrate the effectiveness of the theoretical result.

Introduction

Neural networks have been extensively studied, mostly because of their applications in image processing, pattern recognition, combinatorial optimization, fixed-point computations, and so on (see [1], [2], [3], [4], [5], and the references therein). According to whether neuron states (the external states of neurons) or local field states (the internal states of neurons) are taken as basic variables, neural networks can be classified as static neural networks or local field neural networks [6], [7]. Static neural networks have been widely adopted to deal with various optimization problems, for example, the linear variational inequality problem, which contains linear and convex quadratic programming problems and linear complementarity problems as special cases [8], [9]. Examples such as the oculomotor integrator and the head-direction system show that static neural networks and local field neural networks are not always equivalent [10]. In the past few years, the stability analysis problem for the latter has been extensively studied [11], whereas the former has received little attention.

Switched systems, an important special class of hybrid dynamical systems, consist of a finite number of subsystems described by differential or difference equations and a switching signal that orchestrates switching between these subsystems; they have attracted considerable attention in the last decade because of their importance in both theory and practice. To describe switching phenomena in neural networks, so-called switched neural networks have been proposed, and their stability has been investigated in the continuous-time setting [12], [13]. However, it is essential to formulate discrete-time analogues of continuous-time systems when one wants to simulate or compute continuous-time neural networks after obtaining their dynamical characteristics [14], [15]. Thus, research on the stability properties of discrete-time neural networks is necessary [16], [17], [18], [19], [20]. Some initial stability results for discrete-time switched neural networks have recently appeared, for example, in Wu et al. [21], Hou et al. [22] and Arunkumar et al. [23]. On the other hand, in real nervous systems synaptic transmission is a noisy process brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes [24], [25], [26], [27], [28]. It is also known that a neural network can be stabilized or destabilized by certain stochastic inputs. Stochastic perturbations should therefore be considered when investigating the stability of discrete-time neural networks.

Time delays are unavoidably encountered in the implementation of neural networks, and they may induce undesirable dynamic behaviors such as oscillation, instability or other poor performance. It is worth pointing out that investigations of discrete-time stochastic neural networks have so far considered only the deterministic time-delay case [29]. In fact, the time delays in some neural networks occur in a stochastic fashion [30], [31], and their probabilistic characteristics, such as a Bernoulli, Poisson or normal distribution, can often be obtained by statistical methods. It often happens in real systems that some values of the time delay are very large but the probability of the delay taking such large values is very small. In this case, if only the variation range of the time delay is used to derive the criteria, the results may be rather conservative. Hence, discrete-time neural networks with stochastic delays should be considered.
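As an illustrative sketch of such a Bernoulli-distributed delay, consider a delay that takes a common small value with high probability and a rare large value otherwise; the parameter names and values below are our own, not the paper's:

```python
import numpy as np

def sample_delay(rng, d_small=1, d_large=5, p_small=0.9):
    # With probability p_small the delay takes the common short value;
    # otherwise it takes the rare large value (a Bernoulli-distributed choice).
    return d_small if rng.random() < p_small else d_large

rng = np.random.default_rng(0)
delays = [sample_delay(rng) for _ in range(10_000)]
frac_large = delays.count(5) / len(delays)  # empirical frequency of the rare large delay
```

A criterion that exploits the probability `p_small`, rather than only the range [1, 5], can therefore be less conservative when the large delay is rare.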

Nonlinearities are ubiquitous in practice, and it is well known that a large class of nonlinearities can be interpreted as additive nonlinear exogenous disturbances caused by environmental circumstances. In today's networked environment, such nonlinear disturbances may themselves be subject to random abrupt changes, which may result from phenomena such as random failures and repairs of components, changes in the interconnections of subsystems, sudden environmental changes, and modification of the operating point of a linearized model of a nonlinear system [32]. In other words, the nonlinear disturbances may occur in a random way according to a certain probability law; such disturbances are called randomly occurring nonlinearities (RONs). When modelling RONs, the Bernoulli distribution model has proven to be a flexible yet effective one that has been frequently employed. Recently, the concept of RONs has been introduced to model randomly occurring nonlinear functions in Liang et al. [33] for complex networks, in Shen et al. [32] and Wang et al. [34] for sensor networks, and in Hu et al. [35] for discrete stochastic systems. However, RONs have not yet been taken into account when modelling discrete-time neural networks.
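The Bernoulli model of a RON can be sketched as a 0/1 indicator gating the nonlinear disturbance; the mean `alpha_bar` and the `tanh` disturbance below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def ron_term(rng, f, x, alpha_bar=0.3):
    # Randomly occurring nonlinearity: the disturbance f(x) enters the dynamics
    # only when the Bernoulli indicator alpha(k) equals 1, which happens with
    # probability alpha_bar (an assumed value).
    alpha = 1 if rng.random() < alpha_bar else 0
    return alpha * f(x)

rng = np.random.default_rng(1)
x = np.array([0.5, -0.2])
terms = [ron_term(rng, np.tanh, x) for _ in range(5)]  # each term is 0 or f(x)
```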

Motivated by the above discussion, a discrete-time stochastic switched static neural networks (DSSSNNs) model with both RONs and stochastic delays is proposed, and its stability problem is addressed. To the authors' knowledge, this study is the first attempt to tackle the stability analysis problem for this kind of neural networks. In Section 2, the DSSSNNs model is formulated and some preliminaries are presented. A mean square exponential stability condition for the proposed DSSSNNs model is given in Section 3. In Section 4, a numerical example is provided to illustrate the effectiveness of the theoretical results. Finally, conclusions are drawn in Section 5.

Terminology: Let N+ denote the set of nonnegative integers, R the set of real numbers, Rn the n-dimensional Euclidean space, and Rn×m the set of all n×m real matrices. AT stands for the transpose of a matrix A and λmax(A) denotes the largest eigenvalue of A. I denotes the identity matrix. [a:b] is the set of all integers between a and b. For symmetric matrices X and Y, the notation X>Y (X≥Y) means that the matrix X−Y is positive definite (nonnegative definite). diag{·} denotes a block diagonal matrix and ∗ represents the elements below the main diagonal of a symmetric matrix. Moreover, let (Ω, F, {Ft}t≥0, P) be a complete probability space with a filtration {Ft}t≥0 containing all P-null sets and being right continuous. E{·} denotes the mathematical expectation operator with respect to the given probability measure P. Denote by L2F0([−τ, 0]; Rn) the family of all F0-measurable C([−τ, 0]; Rn)-valued random variables ϕ = {ϕ(s) : −τ ≤ s ≤ 0} with the norm ‖ϕ‖ = sup−τ≤s≤0 E|ϕ(s)|2 < ∞. In the sequel, if not explicitly stated, matrices are assumed to have compatible dimensions.


Model description and preliminaries

We propose the following model to represent n-neuron discrete-time stochastic static neural networks with RONs and time-varying delay:
x(k+1) = Cx(k) + Bg(Wx(k−d(k))) + α(k)Af(x(k)) + σ(k, x(k))ω(k),
x(j) = ϕ(j), j = −d2, −d2+1, …, −1, 0,
where x(k) = [x1(k), x2(k), …, xn(k)]T ∈ Rn is the neural state vector at time k, C = diag{c1, c2, …, cn} is the state feedback coefficient matrix with |ci| < 1; A, B, W ∈ Rn×n are known real constant matrices, g(Wx(k−d(k))) = [g1(W1x(k−d(k))), g2(W2x(k−d(k))), …, gn(Wnx(k−d(k)))]T is the neuron
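A rough simulation sketch of this recursion is given below; the 2-neuron matrices, the `tanh` activations, the noise intensity σ(k, x) = 0.01·x, and the delay/RON probabilities are all placeholder assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder 2-neuron parameters: C is diagonal with |c_i| < 1, as the model requires.
C = np.diag([0.1, 0.4])
A = np.array([[0.1, 0.2], [0.0, 0.1]])
B = np.array([[0.0, 0.2], [0.3, 0.1]])
W = np.array([[0.6, 0.3], [1.2, 0.3]])
f = g = np.tanh  # assumed sector-bounded activations

def step(x_hist, k, d_k, alpha_k):
    # x(k+1) = C x(k) + B g(W x(k-d(k))) + alpha(k) A f(x(k)) + sigma(k, x(k)) omega(k)
    x_k = x_hist[k]
    x_delayed = x_hist[k - d_k]
    sigma = 0.01 * x_k                  # simple state-dependent noise intensity (assumed)
    omega = rng.standard_normal()       # scalar white-noise sample
    return C @ x_k + B @ g(W @ x_delayed) + alpha_k * (A @ f(x_k)) + sigma * omega

d2 = 5                                   # maximum delay
hist = [np.array([0.5, -0.5])] * (d2 + 1)  # constant initial segment phi(j)
for k in range(d2, d2 + 50):
    d_k = int(rng.integers(1, d2 + 1))   # time-varying delay drawn from [1, d2]
    alpha_k = int(rng.random() < 0.3)    # Bernoulli RON indicator (assumed mean 0.3)
    hist.append(step(hist, k, d_k, alpha_k))
```

With these contractive placeholder gains the trajectory remains bounded, which is the qualitative behavior the stability analysis seeks to guarantee.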

Main results

For presentation convenience, in the following we denote d10 = d0 − d1 + 1, d20 = d2 − d0 + 1, d12 = d2 − d1 + 1.

Firstly, we consider the ith subsystem, that is, when ξ(k) = i,
x(k+1) = Cix(k) + α(k)Aif(x(k)) + β(k)Big(Wx(k−d1(k))) + (1−β(k))Big(Wx(k−d2(k))) + σ(k, x(k))ω(k)
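The β(k)-gated delayed term of this subsystem can be sketched as follows; the Bernoulli mean `beta_bar` and the matrices used in the usage example are illustrative assumptions:

```python
import numpy as np

def delayed_term(rng, B_i, g, W, x_d1, x_d2, beta_bar=0.8):
    # One realization of the subsystem's delayed feedback term:
    #   beta(k) B_i g(W x(k-d1(k))) + (1 - beta(k)) B_i g(W x(k-d2(k))),
    # where beta(k) is Bernoulli with mean beta_bar (an assumed value), so at
    # each step exactly one of the two delayed states enters the dynamics.
    beta = 1 if rng.random() < beta_bar else 0
    return beta * (B_i @ g(W @ x_d1)) + (1 - beta) * (B_i @ g(W @ x_d2))

rng = np.random.default_rng(4)
B_i = np.array([[0.0, 0.2], [0.3, 0.1]])   # illustrative subsystem matrix
W = np.array([[0.6, 0.3], [1.2, 0.3]])
x_d1, x_d2 = np.array([0.3, 0.1]), np.array([-0.2, 0.4])
term = delayed_term(rng, B_i, np.tanh, W, x_d1, x_d2)
```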

Theorem 1

Under Assumptions 1–4, the neural networks (6) are globally exponentially stable if there exist matrices Pi > 0, Q1i > 0, Q2i > 0 and three scalars λk, k = 1, 2, 3, such that the following linear matrix inequalities hold for any i ∈ I:
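The LMI itself appears in the full text; as a small numerical sketch of the side conditions, the positive definiteness required of candidate Lyapunov matrices Pi, Q1i, Q2i can be checked as below (`is_pos_def` and the matrix `P` are our own illustrations):

```python
import numpy as np

def is_pos_def(M, tol=1e-10):
    # Check M > 0 numerically via the eigenvalues of its symmetric part,
    # as required of the Lyapunov matrices P_i, Q_1i, Q_2i in the criterion.
    S = (M + M.T) / 2
    return bool(np.all(np.linalg.eigvalsh(S) > tol))

P = np.array([[2.0, 0.3], [0.3, 1.5]])  # a candidate Lyapunov matrix (illustrative)
```

In practice the full LMI feasibility problem would be handed to a semidefinite-programming solver rather than checked by hand.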

An illustrative example

A numerical example is presented to illustrate the effectiveness of the theoretical results. Let the number of subsystems be l = 2.

Example

Consider the DSSSNNs (6) with the following parameters:
C1 = [0.1 0; 0 0.4], A1 = [0.1 0.2; 0 0.1], B1 = [0 0.2; 0.3 0.1],
C2 = [0.4 0; 0 0.2], A2 = [0.1 0.2; 0.3 0.1], B2 = [0 0.2; 0.1 0.5],
W = [0.6 0.3; 1.2 0.3], F1 = [0.1 0.2; 0.4 0.3], F2 = [0.3 0.4; 0.5 0.7],
G1 = [0.2 0.1; 0.1 0.3], G2 = [0.9 0.3; 0.3 0.7],
d1 = 1, d0 = 2, d2 = 5.
The sector-bounded nonlinear functions are assumed to be
f(x(k)) = ((F1+F2)/2)x(k) + ((F2−F1)/2)sin(x(k)), g(x(k)) = ((G1+G2)/2)x(k) + ((G2−G1)/2)sin(x(k)),
and the time-varying
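A rough simulation of this two-mode example can be sketched as below, taking the extracted matrix entries at face value (signs may have been lost in extraction); the periodic switching signal, the Bernoulli means for α(k) and β(k), and the omission of the noise term are all our own simplifications:

```python
import numpy as np

rng = np.random.default_rng(3)

# Extracted example parameters (taken at face value).
C = [np.diag([0.1, 0.4]), np.diag([0.4, 0.2])]
A = [np.array([[0.1, 0.2], [0.0, 0.1]]), np.array([[0.1, 0.2], [0.3, 0.1]])]
B = [np.array([[0.0, 0.2], [0.3, 0.1]]), np.array([[0.0, 0.2], [0.1, 0.5]])]
W = np.array([[0.6, 0.3], [1.2, 0.3]])
F1 = np.array([[0.1, 0.2], [0.4, 0.3]]); F2 = np.array([[0.3, 0.4], [0.5, 0.7]])
G1 = np.array([[0.2, 0.1], [0.1, 0.3]]); G2 = np.array([[0.9, 0.3], [0.3, 0.7]])

# Sector-bounded nonlinearities built from F1, F2 and G1, G2.
f = lambda x: (F1 + F2) / 2 @ x + (F2 - F1) / 2 @ np.sin(x)
g = lambda x: (G1 + G2) / 2 @ x + (G2 - G1) / 2 @ np.sin(x)

d1, d2 = 1, 5
hist = [np.array([0.6, -0.4])] * (d2 + 1)   # constant initial segment
for k in range(d2, d2 + 100):
    i = k % 2                                # arbitrary periodic switching signal (assumed)
    beta = int(rng.random() < 0.8)           # Bernoulli delay selector (assumed mean 0.8)
    alpha = int(rng.random() < 0.3)          # Bernoulli RON indicator (assumed mean 0.3)
    x = hist[k]
    x_delayed = hist[k - d1] if beta else hist[k - d2]
    hist.append(C[i] @ x + alpha * (A[i] @ f(x)) + B[i] @ g(W @ x_delayed))
```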

Conclusions

In this paper, we have investigated the mean square exponential stability problem for a class of discrete-time stochastic switched static neural networks. The subsystem under investigation involves randomly occurring nonlinearities, stochastic disturbances, and stochastic delay. An effective linear matrix inequality approach has been proposed to derive the stability criterion. A numerical example has been given to show the effectiveness of the proposed method.

Acknowledgments

This work was jointly supported by the National Natural Science Foundation of China under Grant 10901073, 61272530, 11072059 and 11202084, the Postdoctoral Science Funds of China (2011M500082) and the Postdoctoral Fund of Jiangsu Province (1101076C), the Specialized Research Fund for the Doctoral Program of Higher Education under Grant no. 20110092110017, the Natural Science Foundation of Jiangsu Province of China under Grant no. BK2012741, and the Deanship of Scientific Research (DSR), King



Manfeng Hu received both the B.S. and M.S. degrees from Xuzhou Normal University, Xuzhou, China, and the Ph.D. degree from Jiangnan University, Wuxi, China, in 1998, 2001, and 2008, respectively. He was a Visiting Scholar at the Department of Mathematics, Michigan State University, USA from June 2008 to June 2009. Currently, he is a Postdoctoral Research Fellow at the School of Automation, Southeast University, Nanjing, China. Since July 2008, he has been Associate Professor at Jiangnan University. His current research interests include nonlinear dynamics, dynamics of complex networks and system biology.

Jinde Cao (M’07–SM’07) received the B.S. degree from Anhui Normal University, Wuhu, China, the M.S. degree from Yunnan University, Kunming, China, and the Ph.D. degree from Sichuan University, Chengdu, China, all in mathematics/applied mathematics, in 1986, 1989, and 1998, respectively. From March 1989 to May 2000, he was with the Yunnan University. In May 2000, he joined the Department of Mathematics, Southeast University, Nanjing, China. From July 2001 to June 2002, he was a Postdoctoral Research Fellow at the Department of Automation and Computer-Aided Engineering, Chinese University of Hong Kong, Hong Kong. In the period from 2006 to 2008, he was a Visiting Research Fellow and a Visiting Professor at the School of Information Systems, Computing and Mathematics, Brunel University, UK. Currently, he is a Professor and Doctoral Advisor at the Southeast University, prior to which he was a Professor at Yunnan University from 1996 to 2000. He is the author or coauthor of more than 300 journal papers and five edited books. His research interests include nonlinear systems, neural networks, complex systems and complex networks, stability theory, and applied mathematics. Dr. Cao was an Associate Editor of the IEEE Transactions on Neural Networks and Neurocomputing. He is an Associate Editor of the Abstract and Applied Analysis, Differential Equations and Dynamical Systems, Discrete Dynamics in Nature and Society, International Journal of Differential Equations, Journal of Applied Mathematics, Journal of the Franklin Institute, Mathematics and Computers in Simulation, and Neural Networks. Dr. Cao is a Reviewer of Mathematical Reviews and Zentralblatt-Math.

Aihua Hu received the B.S. degree in Information and Computing Science from the Jiangnan University, Wuxi, China, in 2003, and the M.S. degree and Ph.D. degree in Control Theory and Engineering from the Jiangnan University, Wuxi, China, in 2006 and 2010, respectively. Currently, she is an Associate Professor and Graduate Student Advisor at the Jiangnan University. She is the author or coauthor of about 20 journal papers. Her research interests include nonlinear systems, stochastic systems and complex networks.
