Neural Networks

Volume 20, Issue 10, December 2007, Pages 1067-1080

Global exponential periodicity and global exponential stability of a class of recurrent neural networks with various activation functions and time-varying delays

https://doi.org/10.1016/j.neunet.2007.07.007

Abstract

The paper presents theoretical results on the global exponential periodicity and global exponential stability of a class of recurrent neural networks with various general activation functions and time-varying delays. The general activation functions include monotone nondecreasing functions, globally Lipschitz continuous and monotone nondecreasing functions, semi-Lipschitz continuous mixed monotone functions, and Lipschitz continuous functions. For each class of activation functions, testable algebraic criteria for ascertaining the global exponential periodicity and global exponential stability of the recurrent neural networks are derived by using the comparison principle and the theory of monotone operators. Furthermore, the rate of exponential convergence and bounds on the attractive domains of periodic oscillations or equilibrium points are estimated. The convergence analysis based on the generalized activation functions widens the application scope for the design of neural network models. In addition, the new and effective analytical method enriches the toolbox for the qualitative analysis of neural networks.

Introduction

Periodic oscillation in recurrent neural networks is a very interesting dynamic behavior, as many biological and cognitive activities (e.g., heartbeat, respiration, mastication, locomotion, and memorization) exhibit periodicity. Persistent oscillation, such as a limit cycle, represents a common feature of neural firing patterns produced by the dynamic interplay between cellular and synaptic mechanisms. Stimulus-evoked oscillatory synchronization has been observed in many biological neural systems, including the cerebral cortex of mammals and the brains of insects. In addition, periodic oscillations in recurrent neural networks have found many applications, such as associative memories (Nishikawa, Lai, & Hoppensteadt, 2004), pattern recognition (Wang, 1995), machine learning (Ruiz, Owens, & Townley, 1998), and robot motion control (Jin & Zacksenhouse, 2003).

In both biological and artificial neural systems, time delays due to integration and communication are ubiquitous and often become a source of instability. The time delays in electronic neural networks are usually time-varying, and sometimes vary violently with time due to the finite switching speed of amplifiers and faults in the electrical circuitry. They slow down the transmission rate and tend to introduce some degree of instability into circuits. It is also known that time delays can cause oscillations in neurodynamics (Belair et al., 1996; Gopalsamy & Leung, 1996). Therefore, it is important to investigate the periodicity of recurrent neural networks with time delays.

In recent years, studies of the periodic oscillation of various neural networks have been reported in Cao and Wang (2000, 2002), Chen and Wang (2004), Chen, Wang, and Liu (2000), Dong (2002), Fang and Li (2000), Guo and Huang (2003), Guo, Huang, Dai, and Zhang (2003), Jiang, Li, and Teng (2003), Liu and Liao (2004), Liu, Chen, Cao, and Huang (2003), Liu, Chen, and Huang (2004), Rehim, Jiang, and Teng (2004), Townley et al. (2000), Wang and Zou (2004), and Yang and Dillon (1994). Any equilibrium point can be viewed as a special case of a periodic solution with an arbitrary period or zero amplitude. In this sense, the analysis of the periodic oscillation of neural networks is more general than the stability analysis of equilibrium points. There are numerous results on the stability analysis of recurrent neural networks. For example, results on the existence, uniqueness, and global stability of the equilibria of the recurrent neural network model (2), with and without time delay, are discussed in Arik and Tavsanoglu (2000), Cao and Wang (2003), Chen and Wang (2004), Civalleri, Gilli, and Pandolfi (1993), Cohen and Grossberg (1983), Grossberg (1968a, 1968b, 1969a, 1969b, 1971, 1988), Hirsch (1989), Hopfield (1984), Hu and Wang (2002b), Liang, X. and Wang, J. (2000), Liang and Wang (2001), Liang and Si (2001), Liao and Wang (2000, 2003), Takahashi (2000), Wang and Zou (2002), Xia and Wang (2001b), Zeng, Wang, and Liao (2003), and the references cited therein.

In conducting a periodicity or stability analysis of a neural network, the conditions to be imposed on the network are determined by the characteristics of its activation function as well as its network parameters. When neural networks are designed for problem solving, it is desirable for their activation functions to be general. In many electronic circuits, amplifiers with rather general continuous input–output functions can be adopted. To facilitate the design of neural networks, it is therefore important to study neural networks with general activation functions. The generalization of activation functions provides a wider scope for neural network designs and applications.

The basin of attraction of a periodic oscillation or equilibrium point depends on the parameters and activation function of the neural network. It is also highly desirable to obtain bounds on the attractive domain of a periodic oscillation or equilibrium point. However, such bounds have not been fully explored in existing studies. Therefore, the a priori estimation of the attractors of the periodic oscillations and equilibrium points of neural networks with time-varying delays deserves in-depth investigation.

Consider the model of a class of recurrent neural networks with time-varying delays (DRNNs) of the form
$$\frac{\mathrm{d}x_i(t)}{\mathrm{d}t} = -d_i x_i(t) + g_i\Big(\sum_{j=1}^{n} w_{ij}\,x_j(t-\tau_{ij}(t)) + u_i\Big), \quad i=1,\ldots,n; \tag{1}$$
where $x_i(t)$ denotes the state variable associated with the $i$th neuron, $d_i>0$ is a constant, $W=(w_{ij})\in\mathbb{R}^{n\times n}$ is a connection weight matrix, $u=(u_1,\ldots,u_n)^{\mathrm{T}}$ denotes the external input vector, which can be periodic or constant, $\tau_{ij}(t)$ is a non-negative function representing the finite speed of axonal signal transmission, and $g_i(\cdot)$ is an activation function for $i=1,2,\ldots,n$.
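To make the delayed dynamics of model (1) concrete, the following is a minimal numerical sketch (not from the paper) that integrates the model with a forward-Euler scheme and a history buffer for the delayed states. The function `simulate_drnn`, the step size `dt`, the bound `tau_max` on the delays, and all parameter values in the usage example are assumptions introduced purely for illustration.

```python
import numpy as np

def simulate_drnn(W, d, g, u, tau, tau_max, T=50.0, dt=0.01, x0=None):
    """Forward-Euler integration of DRNN model (1):
    dx_i/dt = -d_i x_i(t) + g_i(sum_j w_ij x_j(t - tau_ij(t)) + u_i(t)),
    where tau(t) returns an (n, n) array of delays bounded by tau_max."""
    n = W.shape[0]
    steps = int(T / dt)
    max_lag = int(np.ceil(tau_max / dt))
    # history buffer: row k holds the state at time (k - max_lag) * dt
    x = np.zeros((steps + max_lag + 1, n))
    if x0 is not None:
        x[: max_lag + 1] = x0      # constant initial function on [-tau_max, 0]
    for k in range(max_lag + 1, steps + max_lag + 1):
        t = (k - 1 - max_lag) * dt
        lags = np.rint(tau(t) / dt).astype(int)    # (n, n) integer lags
        # delayed states: xd[i, j] = x_j(t - tau_ij(t))
        xd = x[k - 1 - lags, np.arange(n)]
        net = (W * xd).sum(axis=1) + u(t)          # per-neuron net input
        x[k] = x[k - 1] + dt * (-d * x[k - 1] + g(net))
    return x[max_lag:]

# illustrative usage (all parameter values hypothetical)
W = np.array([[-1.0, 0.5], [-1.0, -2.0]])
d = np.ones(2)
traj = simulate_drnn(W, d, np.tanh,
                     u=lambda t: np.array([np.sin(t), np.cos(t)]),
                     tau=lambda t: np.full((2, 2), 1.0 + 0.5 * np.sin(t)),
                     tau_max=1.5)
```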

When $\tau_{ij}(t)\equiv 0$, the special case of model (1) becomes the recurrent neural network (RNN) model, which can be found in Grossberg (1988) and Hirsch (1989) and has been applied in many fields such as optimization and robotics; e.g., Liang, X.B. and Wang, J. (2000), Liang and Wang (2001), Liang and Si (2001), Xia and Wang (2000, 2001a), Xia, Leung, and Wang (2002), Xia and Wang (2004a, 2004b), Zhang, Wang, and Xu (2002), Zhang, Wang, and Xia (2003). With appropriate assumptions, it includes a number of models from neurobiology, population biology, and evolutionary theory. Due to the finite speeds of switching and transmission of signals, time delays unavoidably exist in a working network and thus should be incorporated into the model. If $\tau_{ij}(t)\equiv\tau_i(t)$, $WD=DW$, and $W$ is nonsingular, where $D=\mathrm{diag}(d_1,\ldots,d_n)$, then by using $y_i(t)=\sum_{j=1}^{n} w_{ij}x_j(t)+u_i$ $(i=1,\ldots,n)$, DRNN model (1) can be transformed into the additive DRNN model (Grossberg, 1988)
$$\frac{\mathrm{d}y_i(t)}{\mathrm{d}t} = -d_i y_i(t) + \sum_{j=1}^{n} w_{ij}\,g_j\big(y_j(t-\tau_{ij}(t))\big) + J_i, \quad i=1,\ldots,n, \tag{2}$$
where $J_i=d_i u_i$. The dynamics of this additive DRNN model have been widely investigated in recent years and many important results have been obtained. All of these existing results for model (2) can be applied to model (1) when $\tau_{ij}(t)\equiv\tau_i(t)$, $DW=WD$, $D=\mathrm{diag}(d_1,\ldots,d_n)$, and $W$ is nonsingular. However, when $\tau_{ij}(t)\not\equiv\tau_i(t)$, $DW\neq WD$, or $W$ is singular, whether or not the existing results for DRNN model (2) are effective for DRNN model (1) remains to be explored, since the two models are not equivalent in general. Thus, it is important to analyze DRNN model (1) directly.
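The role of the conditions $WD=DW$ and $\det W\neq 0$ can be seen by carrying out the substitution explicitly. The following derivation is a sketch reconstructed from the definitions above, under the assumptions of constant $u$ and $\tau_{ij}(t)\equiv\tau_i(t)$:

```latex
\begin{align*}
y(t) &= Wx(t) + u
  \;\Longrightarrow\; \dot y(t) = W\dot x(t) = -WDx(t) + W g\big(\hat y(t)\big),\\
\hat y_j(t) &:= \sum_{k=1}^{n} w_{jk}\,x_k\big(t-\tau_j(t)\big) + u_j
  = y_j\big(t-\tau_j(t)\big) \quad (u \text{ constant}),\\
-WDx(t) &= -DWx(t) = -D\big(y(t)-u\big) \quad (\text{using } WD = DW),\\
\dot y_i(t) &= -d_i y_i(t) + \sum_{j=1}^{n} w_{ij}\, g_j\big(y_j(t-\tau_j(t))\big) + J_i,
  \qquad J_i = d_i u_i,
\end{align*}
```

which is model (2). Nonsingularity of $W$ makes the change of variables $x\mapsto y$ a bijection, so trajectories of (1) and (2) correspond one-to-one under these assumptions.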

In this paper, we consider DRNN model (1) with an activation function belonging to any of four classes of continuous functions: monotone nondecreasing functions, globally Lipschitz continuous and monotone nondecreasing functions, semi-Lipschitz continuous mixed monotone functions, and Lipschitz continuous functions. All of these activation functions may be unbounded. For the various activation functions, we establish conditions ascertaining the convergence of DRNN model (1) to unique periodic oscillation trajectories or equilibria, within an estimated attractive domain and at an estimated convergence rate.

The remainder of the paper is organized in six sections. Section 2 provides preliminary information. Sections 3 and 4 present the criteria for the global exponential periodicity of DRNN model (1) with monotone and non-monotone activation functions, respectively. As special cases of the results in Sections 3 and 4, Section 5 discusses the conditions for the global exponential stability of the equilibrium point of DRNN model (1) with constant inputs. In Section 6, two numerical examples are given along with simulation results. Finally, Section 7 concludes the paper.


Preliminaries

In this section, we will introduce some definitions and notations which will be needed later.

In the following definitions, let $L$, $M$, $LM$, $MM$, and $SM$ denote the set of Lipschitz continuous functions, the set of monotone nondecreasing functions, the set of Lipschitz continuous and monotone nondecreasing functions, the set of mixed monotone functions, and the set of semi-Lipschitz continuous and mixed monotone functions, respectively.
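For intuition, a few familiar functions fall into these classes; the particular examples below are our own illustrations, not drawn from the paper.

```python
import numpy as np

# Illustrative members of the activation-function classes (these specific
# choices are assumptions for illustration, not taken from the paper):
g_LM = np.tanh    # in LM: Lipschitz continuous and monotone nondecreasing
g_M  = np.cbrt    # in M:  monotone nondecreasing, but not Lipschitz at 0
g_L  = np.sin     # in L:  Lipschitz continuous, but not monotone
```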

Definition 1

Let $g(x)=(g_1(x_1),\ldots,g_n(x_n))^{\mathrm{T}}$ for $x$ …

Monotone activation functions

In this section, we consider DRNN model (1) with activation function $g\in M$ or $g\in LM$. First, DRNN model (1) can be embedded into the following $2n$-dimensional coupled system, $i=1,\ldots,n$:
$$\begin{cases}
\dot\eta_i(t) = -d_i\eta_i(t) + g_i\Big(w_{ii}\eta_i(t) + \sum_{j=1,\,j\neq i}^{n}\big[w_{ij}^{+}\eta_j(t-\tau_{ij}(t)) - w_{ij}^{-}\xi_j(t-\tau_{ij}(t))\big] + u_i(t)\Big),\\[1ex]
\dot\xi_i(t) = -d_i\xi_i(t) + g_i\Big(w_{ii}\xi_i(t) + \sum_{j=1,\,j\neq i}^{n}\big[w_{ij}^{+}\xi_j(t-\tau_{ij}(t)) - w_{ij}^{-}\eta_j(t-\tau_{ij}(t))\big] + u_i(t)\Big),
\end{cases}\tag{10}$$
with the initial states $\eta(s)=\varphi(s)$, $\xi(s)=\psi(s)$, $s\in[-\tau,0]$, $\varphi,\psi\in C_\tau$. Let $(\eta^{\mathrm{T}}(t,\varphi,\psi),\xi^{\mathrm{T}}(t,\varphi,\psi))^{\mathrm{T}}$ and $(\eta_t^{\mathrm{T}}(\varphi,\psi),\xi_t^{\mathrm{T}}(\varphi,\psi))^{\mathrm{T}}$ denote solutions of system (10) in the product …
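A minimal sketch of the right-hand side of the coupled system (10) follows, assuming $w_{ij}^{+}=\max(w_{ij},0)$ and $w_{ij}^{-}=\max(-w_{ij},0)$, the standard positive/negative-part decomposition of the off-diagonal weights; the function name and calling convention are our own.

```python
import numpy as np

def coupled_rhs(eta, xi, eta_d, xi_d, W, d, g, u_t):
    """Right-hand side of the 2n-dimensional coupled system (10).
    eta, xi: current states; eta_d, xi_d: (n, n) delayed-state matrices
    with eta_d[i, j] = eta_j(t - tau_ij(t)), and likewise xi_d."""
    n = W.shape[0]
    off = ~np.eye(n, dtype=bool)                 # mask for j != i
    Wp = np.where(off, np.maximum(W, 0.0), 0.0)  # w_ij^+ (off-diagonal)
    Wm = np.where(off, np.maximum(-W, 0.0), 0.0) # w_ij^- (off-diagonal)
    wii = np.diag(W)
    deta = -d * eta + g(wii * eta + (Wp * eta_d - Wm * xi_d).sum(axis=1) + u_t)
    dxi  = -d * xi  + g(wii * xi  + (Wp * xi_d  - Wm * eta_d).sum(axis=1) + u_t)
    return deta, dxi
```

Note the symmetry of the embedding: when $\eta\equiv\xi$, both equations reduce to model (1), which is what allows comparison arguments between the coupled system and the original network.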

Non-monotone activation functions

In this section, we consider DRNN model (1) with its activation function $g\in SM$ or $g\in L$.

Let $g\in MM$ and let $\gamma(u,v)=(\gamma_1(u_1,v_1),\ldots,\gamma_n(u_n,v_n))^{\mathrm{T}}$ be a mixed monotone representation of $g(x)$. We embed model (1) into the following $2n$-dimensional coupled system, $i=1,\ldots,n$:
$$\begin{cases}
\dot\eta_i(t) = -d_i\eta_i(t) + \gamma_i\big(w_{ii}\eta_i(t) + h_i(\eta_t,\xi_t) + u_i(t),\; w_{ii}\eta_i(t) + h_i(\xi_t,\eta_t) + u_i(t)\big),\\[1ex]
\dot\xi_i(t) = -d_i\xi_i(t) + \gamma_i\big(w_{ii}\xi_i(t) + h_i(\xi_t,\eta_t) + u_i(t),\; w_{ii}\xi_i(t) + h_i(\eta_t,\xi_t) + u_i(t)\big),
\end{cases}$$
where $h_i(\eta_t,\xi_t)=\sum_{j=1,\,j\neq i}^{n} w_{ij}^{+}\eta_j(t-\tau_{ij}(t)) - \sum_{j=1,\,j\neq i}^{n} w_{ij}^{-}\xi_j(t-\tau_{ij}(t))$. Similar to Lemma 1, Lemma 2, …
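For reference, when $g_i$ is Lipschitz with constant $l_i$, one standard mixed monotone representation (a generic textbook construction, not necessarily the paper's choice) is

```latex
\gamma_i(u, v) = \tfrac{1}{2}\bigl(g_i(u) + g_i(v) + l_i\,(u - v)\bigr),
```

which satisfies $\gamma_i(x,x)=g_i(x)$, is nondecreasing in $u$ (since $g_i'\ge -l_i$ wherever the derivative exists), and is nonincreasing in $v$ (since $g_i'\le l_i$), as a mixed monotone representation requires.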

Global exponential stability

In this section, we consider the global exponential stability of DRNN model (1) with constant inputs; i.e.,
$$\frac{\mathrm{d}x_i(t)}{\mathrm{d}t} = -d_i x_i(t) + g_i\Big(w_{ii}x_i(t) + \sum_{j=1,\,j\neq i}^{n} w_{ij}\,x_j(t-\tau_{ij}(t)) + u_i\Big), \quad i=1,\ldots,n; \tag{62}$$
where $u=(u_1,\ldots,u_n)^{\mathrm{T}}\in\mathbb{R}^n$ is a constant input vector and the $\tau_{ij}(t)$ $(i,j=1,\ldots,n)$ are arbitrary non-negative bounded functions.

Firstly, note that system (62) has the same equilibria as the following constant-delay system
$$\frac{\mathrm{d}x_i(t)}{\mathrm{d}t} = -d_i x_i(t) + g_i\Big(w_{ii}x_i(t) + \sum_{j=1,\,j\neq i}^{n} w_{ij}\,x_j(t-\tau_{ij}) + u_i\Big), \quad i=1,\ldots,n.$$
Secondly, when $u$ and the $\tau_{ij}$ are all constants (i.e., …
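Setting $\dot x=0$ in system (62) shows that any equilibrium $x^{*}$ satisfies $d_i x_i^{*}=g_i\big(\sum_{j} w_{ij}x_j^{*}+u_i\big)$ (the delays vanish at an equilibrium), which suggests a simple fixed-point iteration for locating it numerically. The sketch below is our own and assumes the iteration converges; the paper's algebraic criteria are what guarantee a unique globally attractive equilibrium, but convergence of this naive iteration is not guaranteed in general.

```python
import numpy as np

def find_equilibrium(W, d, g, u, x0=None, tol=1e-10, max_iter=10_000):
    """Fixed-point iteration x <- g(W x + u) / d for an equilibrium of (62).
    Convergence is assumed here, not guaranteed in general."""
    x = np.zeros(W.shape[0]) if x0 is None else x0.copy()
    for _ in range(max_iter):
        x_new = g(W @ x + u) / d
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    raise RuntimeError("fixed-point iteration did not converge")
```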

Illustrative examples

In this section, we discuss two numerical examples to illustrate the results.

Example 1

Consider a DRNN model (1) with $n=2$, where the activation functions are $g_1(\sigma)=g_2(\sigma)=\sigma^{1/3}$. Note that $g_i(\cdot)\in M$ $(i=1,2)$. Let
$$W=(w_{ij})=\begin{pmatrix} -1 & 0.5 \\ -1 & -2 \end{pmatrix}.$$
According to Theorem 1, DRNN model (1) is globally exponentially convergent to a unique $\omega$-periodic solution $x^{*}(t)$ for any $d_i>0$ $(i=1,2)$, all non-negative $\omega$-periodic time delays $\tau_{ij}(t)$, $i,j=1,2$, and any $\omega$-periodic input vector $u(t)=(u_1(t),u_2(t))^{\mathrm{T}}$. Specifically, let $d_1=d_2=1$, $u_1(t)=u$ …
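The convergence claimed in Example 1 can be checked numerically with the `simulate_drnn` sketch given after model (1). In the snippet below, only $g_i$ and $W$ come from the example itself; the periodic inputs, delays, and initial states are illustrative assumptions, since the original values are truncated in this excerpt.

```python
import numpy as np

# Numerical check of Example 1 (inputs, delays, and initial states are
# illustrative assumptions; only g_i and W come from the example).
W = np.array([[-1.0, 0.5],
              [-1.0, -2.0]])
d = np.ones(2)
u = lambda t: np.array([np.sin(t), np.cos(t)])          # 2*pi-periodic inputs
tau = lambda t: np.full((2, 2), 1.0 + 0.5 * np.sin(t))  # 2*pi-periodic delays

x1 = simulate_drnn(W, d, np.cbrt, u, tau, tau_max=1.5, T=100.0,
                   x0=np.array([2.0, -1.0]))
x2 = simulate_drnn(W, d, np.cbrt, u, tau, tau_max=1.5, T=100.0,
                   x0=np.array([-3.0, 4.0]))
# Trajectories from different initial functions should converge
# exponentially to the same 2*pi-periodic solution.
print(np.max(np.abs(x1[-100:] - x2[-100:])))
```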

Concluding remarks

The paper presents analytical results on the global exponential periodicity and global exponential stability of a general class of recurrent neural networks with time-varying delays and various activation functions (e.g., monotone nondecreasing functions, globally Lipschitz continuous and monotone nondecreasing functions, semi-Lipschitz continuous mixed monotone functions, and Lipschitz continuous functions). By using the comparison principle and the theory of monotone operators, algebraic criteria are …

Acknowledgement

The first author is grateful for the financial support from the Education Department of Hubei Province (Z200622002), China.

References (56)

  • M. Rehim et al., Boundedness and stability for nonautonomous cellular neural networks with delay, Neural Networks (2004).
  • H. Smith, Monotone semiflows generated by functional differential equations, Journal of Differential Equations (1987).
  • L. Wang et al., Exponential stability of Cohen–Grossberg neural networks, Neural Networks (2002).
  • Y. Xia et al., A recurrent neural network for solving linear projection equations, Neural Networks (2000).
  • S. Arik et al., On the global asymptotic stability of delayed cellular neural networks, IEEE Transactions on Circuits and Systems—Part I: Fundamental Theory and Applications (2000).
  • J. Belair et al., Frustration, stability and delay-induced oscillation in a neural network model, SIAM Journal on Applied Mathematics (1996).
  • J. Cao et al., Periodic oscillatory solution of bidirectional associative memory networks with delays, Physical Review E (2000).
  • J. Cao et al., Exponential stability and periodic oscillatory solution in BAM networks with delays, IEEE Transactions on Neural Networks (2002).
  • J. Cao et al., Global asymptotic stability of a general class of recurrent neural networks with time-varying delays, IEEE Transactions on Circuits and Systems—Part I: Fundamental Theory and Applications (2003).
  • B. Chen, Mixed monotone semiflows and stability for functional differential equations, Acta Mathematica Sinica (1995).
  • K. Chen et al., Weight adaptation and oscillatory correlation for image segmentation, IEEE Transactions on Neural Networks (2000).
  • P.P. Civalleri et al., On stability of cellular neural networks with delay, IEEE Transactions on Circuits and Systems—Part I: Fundamental Theory and Applications (1993).
  • M.A. Cohen et al., Absolute stability and global pattern formation and parallel memory storage by competitive neural networks, IEEE Transactions on Systems, Man and Cybernetics (1983).
  • H. Fang et al., Global exponential stability and periodic solutions of cellular neural networks with delay, Physical Review E (2000).
  • S. Grossberg, Some nonlinear networks capable of learning a spatial pattern of arbitrary complexity, Proceedings of the National Academy of Sciences (1968).
  • S. Grossberg, On learning and energy–entropy dependence in recurrent and nonrecurrent signed networks, Journal of Statistical Physics (1969).
  • S. Grossberg, Learning and energy–entropy dependence in some nonlinear functional-differential systems, Bulletin of the American Mathematical Society (1969).
  • S. Grossberg, Pavlovian pattern learning by nonlinear neural networks, Proceedings of the National Academy of Sciences (1971).

    This work was supported by the Natural Science Foundation of China under Grant 60274007.
