
Neurocomputing

Volume 82, 1 April 2012, Pages 1-13

Multistability and multiperiodicity of high-order competitive neural networks with a general class of activation functions

https://doi.org/10.1016/j.neucom.2011.09.032

Abstract

In this paper, high-order synaptic connectivity is introduced into competitive neural networks, and the multistability and multiperiodicity issues are discussed for high-order competitive neural networks with a general class of activation functions. Based on a decomposition of the state space, the Halanay inequality, the Cauchy convergence principle and inequality techniques, some sufficient conditions are derived for ascertaining that equilibrium points are located in any designated region and are locally exponentially stable. As an extension of multistability, similar results are presented for ensuring multiple periodic solutions when the external inputs and time delays are periodic. The obtained results are different from and less restrictive than those given by Nie and Cao (2009 [25]), and the assumption (H1A) of Nie and Cao (2009 [25]) is relaxed. It is shown that high-order synaptic connectivity plays an important role in the number of equilibrium points and their dynamics. As a consequence, our results refute the traditional viewpoint that high-order synaptic connectivity always yields a faster convergence rate and greater storage capacity than first-order connectivity. Finally, three examples with their simulations are given to show the effectiveness of the obtained results.

Introduction

It is well known that neural networks play important roles in many applications, such as image processing and optimization problems [1], [2], [3], [4]. In [5], the authors pointed out that traditional neural networks with first-order synaptic connections have limitations, such as limited storage capacity, when used in pattern recognition and optimization problems. By incorporating high-order synaptic connectivity, a network can dramatically improve its storage capacity [6] and enlarge the class of optimization problems it can handle [7]. In recent years, the stability analysis of high-order neural networks has attracted the attention of many researchers because of these advantages [8], [9], [10], [11], [12], [13], [14], [15], [16]. On the other hand, in applications such as associative memory storage, pattern recognition and decision making, the existence of many equilibrium points or periodic solutions is a necessary feature [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29], [30], [31], [32], [33], [34], [35]. We use the notions of “multistability” and “multiperiodicity” to describe the coexistence of multiple stable equilibrium points or periodic orbits. However, it is worth noting that: (1) most of the existing works on high-order neural networks study only mono-stability and mono-periodicity, and multistability and multiperiodicity of neural networks with high-order synaptic connectivity have seldom been considered [31]; (2) in many previous papers [8], [9], [12], [13], [14], the authors claim that one of their main motivations relies on the following premise: high-order neural networks have stronger approximation properties, faster convergence rates, greater storage capacity and higher fault tolerance than first-order ones. Thus, an interesting question arises: does high-order synaptic connectivity always imply a faster convergence rate and greater storage capacity?

Based on the above motivations and inspired by [31], in this paper we introduce high-order synaptic connectivity into competitive neural networks and study the multistability and multiperiodicity of high-order competitive neural networks (HOCNNs) with a general class of activation functions, where this class consists of bounded smooth sigmoidal functions and includes $1/(1+e^{-\xi})$, $\tanh\xi$ and $\arctan\xi$ as special cases. More precisely, we shall mainly focus on the following aspects:

  • For HOCNNs, we give conditions guaranteeing that an equilibrium point exists in any designated region and is locally exponentially stable, by means of a decomposition of the state space, the Halanay inequality (recalled just after this list), the Cauchy convergence principle and inequality techniques.

  • As an extension of multistability, we present similar results for multiple periodic solutions when the external inputs and time delays are periodic.

  • The obtained results are different from and less restrictive than those given in [25], and the assumption (H1A) in [25] is relaxed.

  • We reveal the effects of high-order synaptic connectivity on the existence and stability of equilibrium points or periodic solutions of HOCNNs and obtain a new insight: high-order synaptic connectivity may slow down the convergence rate and reduce the storage capacity of HOCNNs.
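Since the Halanay inequality is the key stability tool invoked above, we recall its standard scalar form (the paper may use a variant adapted to mixed delays): if a non-negative function $v(t)$ satisfies
$$D^{+}v(t)\le -\alpha\,v(t)+\beta\sup_{t-\tau\le s\le t}v(s),\qquad t\ge t_{0},$$
with constants $\alpha>\beta>0$, then
$$v(t)\le\Big(\sup_{t_{0}-\tau\le s\le t_{0}}v(s)\Big)e^{-\lambda(t-t_{0})},\qquad t\ge t_{0},$$
where $\lambda>0$ is the unique positive root of $\lambda=\alpha-\beta e^{\lambda\tau}$. Bounds of this type are what yield the exponential convergence rates in the local stability results below.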

Section snippets

Model formulation and preliminaries

Competitive neural networks (CNNs) model the dynamics of cortical cognitive maps with unsupervised synaptic modifications. In this model there are two types of state variable: the short-term memory (STM), describing the fast neural activity, and the long-term memory (LTM), describing the slow unsupervised synaptic modifications. The network can be arranged in a multilayer architecture, or the competition among the neurons can be implemented in just one competitive layer. The
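A sketch of the general $N$-neuron HOCNN form, inferred from the two-dimensional example (28) given below (the full model (3) may include additional terms), where $x_i$ and $S_i$ are the STM and LTM variables, $D_{ij}$ and $D^{\tau}_{ijl}$ the first- and second-order (high-order) connection weights, $\bar{D}_{ij}$ the distributed-delay weights, $k_{ij}$ the delay kernels, and $I_i$ the external inputs:
$$\begin{aligned}
\frac{dx_{i}(t)}{dt}&=-a_{i}x_{i}(t)+\sum_{j=1}^{N}D_{ij}f_{j}(x_{j}(t))+\sum_{j=1}^{N}\sum_{l=1}^{N}D^{\tau}_{ijl}f_{j}(x_{j}(t-\tau_{j}(t)))f_{l}(x_{l}(t-\tau_{l}(t)))\\
&\quad+\sum_{j=1}^{N}\bar{D}_{ij}\int_{t-\sigma}^{t}k_{ij}(t-s)f_{j}(x_{j}(s))\,ds+B_{i}S_{i}(t)+I_{i},\\
\frac{dS_{i}(t)}{dt}&=-S_{i}(t)+f_{i}(x_{i}(t)),\qquad i=1,\dots,N.
\end{aligned}$$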

Main results

In this section, we consider the local exponential stability of equilibrium points of model (3) in each division region. Similar results can also be derived for HOCNNs with periodic delays and external inputs.

Theorem 1

Assume that Assumptions 1 and 2 hold. Then for each division $\tilde{N}=(N_{1},N_{2},N_{3})$ of $N$, each $\Omega_{\tilde{N}}$ is positively invariant with respect to the solution flow generated by system (3) with activation function (4).

Proof

For each division region $\Omega_{\tilde{N}}$ and any initial condition $(\phi^{T},\psi^{T})^{T}=(\phi_{1},\ldots,\phi_{N},\psi_{1},$

Three illustrative examples

For convenience, we consider the following two-dimensional competitive neural network (system (28)):
$$\begin{aligned}
\frac{dx_{i}(t)}{dt}&=-a_{i}x_{i}(t)+\sum_{j=1}^{2}D_{ij}f_{j}(x_{j}(t))+\sum_{j=1}^{2}\sum_{l=1}^{2}D^{\tau}_{ijl}f_{j}(x_{j}(t-\tau_{j}(t)))f_{l}(x_{l}(t-\tau_{l}(t)))\\
&\quad+\sum_{j=1}^{2}\bar{D}_{ij}\int_{t-\sigma}^{t}k_{ij}(t-s)f_{j}(x_{j}(s))\,ds+B_{i}S_{i}(t)+I_{i},\\
\frac{dS_{i}(t)}{dt}&=-S_{i}(t)+f_{i}(x_{i}(t)),\qquad i=1,2.
\end{aligned}$$

Example 1

For system (28), take $a_{1}=B_{1}=1$, $a_{2}=B_{2}=2$, $D_{11}=3$, $D_{22}=5$, $D_{12}=D_{21}=\bar{D}_{11}=\bar{D}_{22}=0$, $\bar{D}_{12}=0.5$, $\bar{D}_{21}=0.5$, $I_{1}=0.5$, $I_{2}=1$, $\sigma=5$, $D^{\tau}_{1jl}=D^{\tau}_{2jl}=0$ $(j,l=1,2)$, $k_{11}(s)=k_{22}(s)=0$, $k_{12}(s)=2e^{-2s}/(1-e^{-10})$, $k_{21}(s)=3e^{-3s}/(1-e^{-15})$, $f_{i}(x)=\arctan x$ $(i=1,2)$.

By simple computations, we
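As a rough numerical check of Example 1, the following sketch integrates system (28) with the parameters above. The forward-Euler scheme, step size, time horizon, and constant initial histories are illustrative assumptions rather than the authors' simulation setup.

import numpy as np

# Illustrative sketch for Example 1 of system (28); the scheme and initial
# conditions below are assumptions, not the authors' original setup.
a    = np.array([1.0, 2.0])                   # a_1, a_2
B    = np.array([1.0, 2.0])                   # B_1, B_2
D    = np.array([[3.0, 0.0],
                 [0.0, 5.0]])                 # first-order weights D_ij
Dbar = np.array([[0.0, 0.5],
                 [0.5, 0.0]])                 # distributed-delay weights \bar D_ij
I    = np.array([0.5, 1.0])                   # external inputs I_1, I_2
sigma = 5.0                                   # distributed-delay window length
f = np.arctan                                 # activation f_i(x) = arctan x

h = 0.01                                      # Euler step size (assumption)
n_lags = int(round(sigma / h))
lags = h * np.arange(1, n_lags + 1)           # lags s in (0, sigma]

# Delay kernels k_ij(s) sampled on the lag grid; only k_12 and k_21 are nonzero.
K = np.zeros((2, 2, n_lags))
K[0, 1] = 2.0 * np.exp(-2.0 * lags) / (1.0 - np.exp(-10.0))
K[1, 0] = 3.0 * np.exp(-3.0 * lags) / (1.0 - np.exp(-15.0))

def simulate(x0, S0, T=40.0):
    """Forward-Euler integration with a history buffer for the distributed delay."""
    x, S = np.array(x0, float), np.array(S0, float)
    hist = np.tile(x, (n_lags, 1))            # constant initial history on [-sigma, 0]
    for _ in range(int(round(T / h))):
        past = f(hist[::-1])                  # past activations, most recent first
        conv = np.array([sum(Dbar[i, j] * h * np.dot(K[i, j], past[:, j])
                             for j in range(2)) for i in range(2)])
        dx = -a * x + D @ f(x) + conv + B * S + I
        dS = -S + f(x)
        x, S = x + h * dx, S + h * dS
        hist = np.vstack([hist[1:], x])       # slide the history window forward
    return x, S

# Different constant initial histories may settle to different stable equilibria,
# which is the multistability the example is intended to illustrate.
for x0 in ([-5.0, -5.0], [5.0, 5.0], [-5.0, 5.0], [5.0, -5.0]):
    print(x0, "->", simulate(x0, [0.0, 0.0]))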

Conclusions

In this paper, the multistability and multiperiodicity issues have been studied for HOCNNs with mixed delays and a general class of activation functions.

  • (1)

    In view of the fact that neural information is often desired to be stored in a designated region, we obtained conditions under which HOCNNs have a unique locally exponentially stable equilibrium point in any designated region. Similar results are presented for multiple periodic solutions when the external inputs and delays are periodic.

  • (2)

    We

Acknowledgments

The authors would like to thank the reviewers and the editor for their valuable suggestions and comments, which have led to a much improved paper. This work was jointly supported by the National Natural Science Foundation of China under Grants 11072059 and 111087, the Foundation for Young Professors of Jimei University, and the Foundation of Fujian Higher Education (JA10184, JA11154, JA11144).

References (40)

Xiaobing Nie received the B.S. degree in mathematics from Yantai Normal University, Yantai, China, in 1999, the M.S. degree in mathematics/applied mathematics from East China Normal University, Shanghai, China, in 2002, and the Ph.D. degree in mathematics/applied mathematics from Southeast University, Nanjing, China, in 2010.

Since July 2002, he has been with the Department of Mathematics, Southeast University, Nanjing, China. He is a very active reviewer for many international journals and has published more than 10 journal papers. His research interests include neural networks, multistability theory, and nonsmooth systems.

Zhenkun Huang received the M.S. degree from Fuzhou University, Fuzhou, China, in 2004, and the Ph.D. degree from Zhejiang University, Hangzhou, China, in 2008, both in applied mathematics. In September 2001, he joined the School of Science, Jimei University, Xiamen, China, where he is currently an Associate Professor. He has been selected for the Training Program for Distinguished Young Scholars and Research Talents of Fujian Higher Education. He is the author or coauthor of more than 30 journal papers in journals such as IEEE Transactions on Neural Networks, IEEE Transactions on Circuits and Systems I: Regular Papers, Physics Letters A, Nonlinear Dynamics, and Journal of Mathematical Analysis and Applications. He is also a reviewer for Mathematical Reviews. His research interests include nonlinear systems, neural networks, and stability analysis of dynamic systems (including continuous, discrete, and impulsive systems).
