Neural Networks

Volume 54, June 2014, Pages 112-122

A systematic method for analyzing robust stability of interval neural networks with time-delays based on stability criteria

https://doi.org/10.1016/j.neunet.2014.03.002

Abstract

This paper presents a systematic method for analyzing the robust stability of a class of interval neural networks with uncertain parameters and time delays. The neural networks are affected by uncertain parameters whose values are time-invariant and unknown, but bounded in given compact sets. Several new sufficient conditions for the global asymptotic/exponential robust stability of the interval delayed neural networks are derived. The results can be cast as linear matrix inequalities (LMIs), which are shown to be generalizations of some existing conditions. Compared with most existing results, the presented conditions are less conservative and easier to check. Two illustrative numerical examples are given to substantiate the effectiveness and applicability of the proposed robust stability analysis method.

Introduction

Recurrent neural networks have found applications in many fields, such as image and signal processing, pattern recognition, optimization, associative memory, control, and modelling. In such applications, it is crucial to ensure the stability of the designed neural networks. For example, when a neural network is designed as an optimization solver, the foremost prerequisite is to guarantee that the neural network is globally asymptotically stable. In recent years, the dynamics of recurrent neural networks have been widely studied (Arik, 2002, Cao and Wang, 2003, Cao and Wang, 2005a, Cao and Wang, 2005b, Cao and Zhou, 1998, Forti and Tesi, 1995, Gao et al., 2013, He et al., 2007, Liao et al., 2002, Mahmouda and Xia, 2011, Shen and Wang, 2007, Shen and Wang, 2008, Shen and Wang, 2012, Wang et al., 2005, Wu et al., 2012, Yu et al., 2007, Zeng and Wang, 2006a, Zeng and Wang, 2006b, Zeng et al., 2003, Zeng et al., 2004, Zeng et al., 2005, Zhang et al., 2010, Zhang et al., 2008).

To analyze the dynamical properties of recurrent neural networks, it is sometimes necessary to take time delays into account. Due to the finite switching speed of amplifiers and the finite communication speed between neurons, time delays are likely to be present in electronic implementations of neural networks. Time delays can change the dynamics of a network, for example by inducing oscillation or other unstable behaviors (Baldi & Atiya, 1994). Moreover, in deterministic neural networks, vital data such as the neuron firing rate and the synaptic interconnection weights are usually measured, acquired, and processed by means of statistical estimation. As a result, estimation errors are unavoidable. Furthermore, inevitable external disturbances and parameter perturbations introduce additional uncertainties into neural network models. In light of the above discussion, it is important to study the robust stability of recurrent neural networks with time delays in the presence of uncertainties. There are mainly two forms of uncertainty, namely interval uncertainty and norm-bounded uncertainty. For interval neural networks with time delays, many delay-dependent or delay-independent robust stability criteria have been established (Arik, 2003, Bao et al., 2012, Cao and Chen, 2004, Cao et al., 2005, Chen et al., 2005, Ensari and Arik, 2010, Faydasicok and Arik, 2012a, Faydasicok and Arik, 2012b, Faydasicok and Arik, 2013, Guo and Huang, 2009, Liao and Yu, 1998, Li et al., 2004, Ozcan and Arik, 2006, Qi, 2007, Shao et al., 2010, Singh, 2005, Wu et al., 2011, Zhang et al., 2007). A brief review of these works reveals that there are mainly two types of methods. One method is to find an upper bound for the norm of the interval matrices and then apply it to the robust stability analysis of neural networks with time delays; e.g., Ensari and Arik (2010), Faydasicok and Arik, 2012a, Faydasicok and Arik, 2012b, Faydasicok and Arik, 2013, Guo and Huang (2009), Ozcan and Arik (2006), Qi (2007), Shao et al. (2010) and Singh (2007). In particular, Faydasicok and Arik (2013) recently proposed a new upper bound for the norm of interval matrices and derived some robust stability criteria, which are shown to be less restrictive than the previous results in Ensari and Arik (2010), Faydasicok and Arik, 2012a, Faydasicok and Arik, 2012b, Ozcan and Arik (2006), Qi (2007), Shao et al. (2010) and Singh (2007) in some cases. Moreover, they summarized the results in Ensari and Arik (2010), Faydasicok and Arik, 2012a, Faydasicok and Arik, 2012b, Faydasicok and Arik, 2013, Ozcan and Arik (2006), Qi (2007), Shao et al. (2010) and Singh (2007), and obtained unified robust stability results. Another method is to use the absolute values of the upper or lower bounds of the network parameters to ascertain the robust stability of interval delayed neural networks; e.g., Cao and Chen (2004) and Chen et al. (2005). A disadvantage of both methods is that the differences between the neuronal excitatory and inhibitory effects are not fully taken into account.

In this paper, we propose a new method for analyzing the robust stability of a general class of interval neural networks with time delays and uncertain parameters whose values are unknown but bounded in given compact sets. Based on existing stability criteria for delayed neural networks with deterministic parameters, this method can be used to derive new sufficient conditions for the global robust asymptotic/exponential stability of uncertain interval delayed neural networks. More importantly, the robust stability criteria herein are less conservative or restrictive than those given in Ensari and Arik (2010), Faydasicok and Arik, 2012a, Faydasicok and Arik, 2012b, Faydasicok and Arik, 2013, Guo and Huang (2009), Ozcan and Arik (2006), Qi (2007), Shao et al. (2010) and Singh (2007). Moreover, they take into account the differences between the neuronal excitatory and inhibitory effects.

The remainder of this paper is organized as follows. In Section 2, interval neural networks with time delays are described. In Section 3, a new method for analyzing the robust stability of interval delayed neural networks is proposed. In Section 4, further criteria for determining the global asymptotic/exponential robust stability of interval neural networks with time delays are derived by applying this method to existing stability criteria of delayed neural networks with deterministic parameters. In Section 5, we make a comparison with previous robust stability results. Moreover, two illustrative and comparative numerical examples are given to show the effectiveness and applicability of the proposed robust stability conditions. Finally, concluding remarks are made in Section 6.

Section snippets

Preliminaries

In this section, we shall formulate the problem discussed in the paper, and introduce some notations and definitions.

Notations: Let $x=(x_1,x_2,\ldots,x_n)^T\in\mathbb{R}^n$ and $A=(a_{ij})_{n\times n}\in\mathbb{R}^{n\times n}$. $\|x\|_2$ is the 2-norm of the column vector $x$, defined as $\|x\|_2=\sqrt{\sum_{i=1}^{n}x_i^2}$. $\|A\|_2$ is the matrix norm induced by the corresponding vector norm, given by $\|A\|_2=[\lambda_{\max}(A^TA)]^{1/2}$. $|A|$ denotes $|A|=(|a_{ij}|)_{n\times n}$. If $A$ is a symmetric matrix, then $A>0$ ($\geq 0$, $<0$) implies that $A$ is positive definite (positive semi-definite, negative
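As a computational illustration of these notations, the following minimal NumPy sketch (not part of the original paper; the vectors and matrices are arbitrary examples) evaluates $\|x\|_2$, $\|A\|_2$, $|A|$, and the definiteness test for a symmetric matrix:

```python
import numpy as np

# Illustrative data only; not from the paper.
x = np.array([1.0, -2.0, 2.0])
A = np.array([[ 2.0, -1.0],
              [-1.0,  3.0]])

# ||x||_2 = sqrt(sum_i x_i^2)
norm_x = np.sqrt(np.sum(x**2))            # same as np.linalg.norm(x, 2)

# ||A||_2 = [lambda_max(A^T A)]^{1/2}, the induced 2-norm
norm_A = np.sqrt(np.max(np.linalg.eigvalsh(A.T @ A)))

# |A| = (|a_ij|)_{n x n}: entrywise absolute value
abs_A = np.abs(A)

# For symmetric A: A > 0 (>= 0, < 0) iff all eigenvalues are positive
# (nonnegative, negative).
eigs = np.linalg.eigvalsh(A)
is_pos_def = bool(np.all(eigs > 0))

print(norm_x, norm_A, is_pos_def)
```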

A method for analyzing robust stability

In this section, we present a new method for analyzing the robust stability of interval delayed neural networks based on the stability criteria of the corresponding deterministic delayed neural networks.

For convenience, we rewrite the parameters of DRNN (1), i.e., $d_1,d_2,\ldots,d_n$, $a_{11},a_{12},\ldots,a_{1n},\ldots,a_{n1},a_{n2},\ldots,a_{nn}$, $b_{11},b_{12},\ldots,b_{1n},\ldots,b_{n1},b_{n2},\ldots,b_{nn}$, in the following form: $\alpha_1,\alpha_2,\ldots,\alpha_m$, which satisfy $\alpha_i\in[\underline{\alpha}_i,\overline{\alpha}_i]$, $i=1,2,\ldots,m=2n^2+n$.
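To make this re-indexing concrete, the sketch below (ours, not the paper's) assumes DRNN (1) carries the parameters $d_i$, $a_{ij}$, $b_{ij}$ in matrices $D$, $A$, $B$ with elementwise interval bounds, and collects the $2n^2+n$ interval parameters into a single bounded vector $\alpha$; the helper name `flatten_parameters` is hypothetical.

```python
import numpy as np

def flatten_parameters(D_lo, D_hi, A_lo, A_hi, B_lo, B_hi):
    """Collect the interval parameters d_i, a_ij, b_ij into one vector alpha
    with bounds alpha_lo <= alpha <= alpha_hi, len(alpha) = 2*n*n + n.
    Illustrative helper only; not from the paper."""
    alpha_lo = np.concatenate([np.diag(D_lo), A_lo.ravel(), B_lo.ravel()])
    alpha_hi = np.concatenate([np.diag(D_hi), A_hi.ravel(), B_hi.ravel()])
    return alpha_lo, alpha_hi

# Example with n = 2, so m = 2*2**2 + 2 = 10 interval parameters.
n = 2
D_lo, D_hi = np.diag([0.9, 0.8]), np.diag([1.1, 1.2])
A_lo, A_hi = -np.ones((n, n)), np.ones((n, n))
B_lo, B_hi = -0.5 * np.ones((n, n)), 0.5 * np.ones((n, n))
alpha_lo, alpha_hi = flatten_parameters(D_lo, D_hi, A_lo, A_hi, B_lo, B_hi)
assert alpha_lo.size == 2 * n**2 + n
```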

As stated in the Introduction, there exist many criteria on the

Robust stability criteria

In this section, by applying the new robust stability analysis method to existing stability criteria of DRNN (1) with deterministic parameters, robust stability criteria for DRNN (1) satisfying (4) are presented in scalar and matrix forms, respectively.

Comparison and examples

In this section, by further analyzing the derived criteria and presenting some examples, we compare our results with existing results in the literature.

For convenience, we introduce the following notations: $A^*=\frac{1}{2}(\overline{A}+\underline{A})$, $A_*=\frac{1}{2}(\overline{A}-\underline{A})$, $B^*=\frac{1}{2}(\overline{B}+\underline{B})$, $B_*=\frac{1}{2}(\overline{B}-\underline{B})$, $\sigma_1(B)=\sqrt{\big\||B^{*T}B^*|+2|B^{*T}|B_*+B_*^TB_*\big\|_2}$, $\sigma_2(B)=\|B^*\|_2+\|B_*\|_2$, $\sigma_3(B)=\sqrt{\|B^*\|_2^2+\|B_*\|_2^2+2\big\|B_*^T|B^*|\big\|_2}$, $\sigma_4(B)=\big\||B|_{\max}\big\|_2$.
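As a numerical companion to these notations, the following sketch (ours, not the paper's) computes $\sigma_1,\ldots,\sigma_4$ for a given interval $[\underline{B},\overline{B}]$ and checks them against $\|B\|_2$ for matrices sampled from the interval; it assumes $|B|_{\max}$ denotes the matrix with entries $\max\{|\underline{b}_{ij}|,|\overline{b}_{ij}|\}$, the usual convention in this literature.

```python
import numpy as np

def sigma_bounds(B_lo, B_hi):
    """Upper bounds sigma_1..sigma_4 on ||B||_2 over the interval [B_lo, B_hi].
    Sketch based on the notations of Section 5; not the paper's code."""
    Bs = 0.5 * (B_hi + B_lo)          # B^*
    Bd = 0.5 * (B_hi - B_lo)          # B_*
    s1 = np.sqrt(np.linalg.norm(
        np.abs(Bs.T @ Bs) + 2 * np.abs(Bs.T) @ Bd + Bd.T @ Bd, 2))
    s2 = np.linalg.norm(Bs, 2) + np.linalg.norm(Bd, 2)
    s3 = np.sqrt(np.linalg.norm(Bs, 2)**2 + np.linalg.norm(Bd, 2)**2
                 + 2 * np.linalg.norm(Bd.T @ np.abs(Bs), 2))
    B_max = np.maximum(np.abs(B_lo), np.abs(B_hi))   # assumed meaning of |B|_max
    s4 = np.linalg.norm(B_max, 2)
    return s1, s2, s3, s4

# Compare the bounds against ||B||_2 for random samples from the interval.
rng = np.random.default_rng(0)
B_lo = np.array([[-1.0, 0.2], [0.1, -0.8]])
B_hi = np.array([[ 0.5, 0.6], [0.4,  0.3]])
samples = [B_lo + rng.random(B_lo.shape) * (B_hi - B_lo) for _ in range(1000)]
worst = max(np.linalg.norm(B, 2) for B in samples)
print(worst, sigma_bounds(B_lo, B_hi))   # each sigma_k should be >= worst
```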

Next, we present some important lemmas, which are used to compare our results with those in the previous literature.

Lemma 12 (Faydasicok & Arik, 2013)

Let B be any real matrix defined

Concluding remarks

This paper presents a systematic method for analyzing the robust stability of interval recurrent neural networks. By utilizing existing algebraic stability criteria of deterministic neural networks, the method provides a general tool to derive robust stability criteria of interval neural networks with uncertain parameters. Specifically, using the proposed method, several new sufficient conditions are obtained for global asymptotic/exponential robust stability of interval neural networks. The

References (51)

  • Z. Guo et al.

    LMI conditions for global robust stability of delayed neural networks with discontinuous neuron activations

    Applied Mathematics and Computation

    (2009)
  • X. Liao et al.

    Delay-dependent exponential stability analysis of delayed neural networks: an LMI approach

    Neural Networks

    (2002)
  • C. Li et al.

    Global robust asymptotical stability of multi-delayed interval neural networks: an LMI approach

    Physics Letters A

    (2004)
  • J.-L. Shao et al.

    Some improved criteria for global robust exponential stability of neural networks with time-varying delays

    Communications in Nonlinear Science and Numerical Simulation

    (2010)
  • V. Singh

    Global robust stability of delayed neural networks: estimating upper limit of norm of delayed connection weight matrix

    Chaos, Solitons and Fractals

    (2007)
  • Z. Wang et al.

    On global asymptotic stability of neural networks with discrete and distributed delays

    Physics Letters A

    (2005)
  • S. Wen et al.

    Exponential stability analysis of memristor-based recurrent neural networks with time-varying delays

    Neurocomputing

    (2012)
  • A. Wu et al.

    Dynamic behaviors of memristor-based recurrent neural networks with time-varying delays

    Neural Networks

    (2012)
  • W. Yu et al.

    An LMI approach to global asymptotic stability of the delayed Cohen-Grossberg neural network via nonsmooth analysis

    Neural Networks

    (2007)
  • Z. Zeng et al.

    Global exponential stability of recurrent neural networks with time-varying delays in the presence of strong external stimuli

    Neural Networks

    (2006)
  • G. Zhang et al.

    Global exponential stability of a class of memristor-based recurrent neural networks with time-varying delays

    Neurocomputing

    (2012)
  • G. Zhang et al.

    Global exponential periodicity and stability of a class of memristor-based recurrent neural networks with multiple delays

    Information Sciences

    (2013)
  • S. Arik

    An analysis of global asymptotic stability of delayed cellular neural networks

    IEEE Transactions on Neural Networks

    (2002)
  • S. Arik

    Global robust stability of delayed neural networks

    IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications

    (2003)
  • P. Baldi et al.

    How delays affect neural dynamics and learning

    IEEE Transactions on Neural Networks

    (1994)

    The work described in the paper was supported by the Research Grants Council of the Hong Kong Special Administrative Region, China (Project No. CUHK416811E), Hong Kong Scholars Program, and National Natural Science Foundation of China (11101133).
