Elsevier

Neurocomputing

Volume 77, Issue 1, 1 February 2012, Pages 101-107
Analysis and design of associative memories based on recurrent neural network with discontinuous activation functions

https://doi.org/10.1016/j.neucom.2011.08.026

Abstract

This paper considers a recurrent neural network (RNN) with a special class of discontinuous activation functions which are piecewise constant in the state space. One sufficient condition is established to ensure that the novel recurrent neural networks can have (4k − 1)^n locally exponentially stable equilibrium points. Such RNNs are suitable for synthesizing high-capacity associative memories. The design procedure is presented with the method of singular value decomposition. Finally, the validity and performance of the results are illustrated by two numerical examples.

Introduction

Associative memories are brain-style devices designed to memorize a set of patterns such that the stored patterns can be retrieved from initial probes containing sufficient information about them. Associative memories include autoassociative and heteroassociative memories.

As models of human brains, neural networks have been used as associative memories. In the past decades, the dynamics of recurrent neural networks (RNNs) and their various generalizations, including the cellular neural network [1] (CNN) as a special case, have been extensively investigated (see, e.g., [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23] and their references) for their applications to associative memory, pattern recognition, optimization computation, etc. It has been shown that an n-neuron RNN with a one-step (k-step) piecewise linear activation function can have 3^n ((4k − 1)^n) equilibrium points, of which the 2^n ((2k)^n) points located in saturation regions are locally exponentially stable [9], [20].
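The capacity figures above are simple arithmetic in n and k, which a short sketch can make concrete (the function name and parameter choices are illustrative, not from the paper):

```python
# Equilibrium counts for an n-neuron RNN, per the results cited from [9], [20]:
# a one-step piecewise linear activation yields 3**n equilibria, 2**n of them
# locally exponentially stable; a k-step activation yields (4k - 1)**n
# equilibria with (2k)**n stable.

def equilibrium_counts(n: int, k: int) -> dict:
    """Return total and locally exponentially stable equilibrium counts."""
    return {
        "one_step_total": 3 ** n,
        "one_step_stable": 2 ** n,
        "k_step_total": (4 * k - 1) ** n,
        "k_step_stable": (2 * k) ** n,
    }

counts = equilibrium_counts(n=2, k=2)
print(counts)  # for n = 2, k = 2: 49 equilibria in total, 16 of them stable
```

Even for small n and k, the k-step count (4k − 1)^n grows much faster than the classical 3^n, which is the capacity advantage the paper builds on.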

In applications, neural networks with discontinuous activation functions [24] are frequently encountered. Global stability and global exponential convergence of the state and output solutions of neural networks with monotone nondecreasing discontinuous activation functions were discussed by Forti et al. in [25], [26], [27], [28]. It is worth noting that conditions ensuring global convergence in finite time were obtained in [25], [28]. Wang et al. [29], [30], [31] generalized Forti's work by discarding the boundedness and monotonicity assumptions on the discontinuous activation function. Huang and Cao [7] introduced a class of neural networks with a special discontinuous activation function that has n^2 isolated equilibrium points which are locally exponentially stable.

Many researchers have used RNNs with various continuous activation functions for associative memory. Two methods have been proposed for designing associative memories based on cellular neural networks in [32], [33], [34], [35], [36], [37], [38], [39], [40], [41], [42], [43], [44], [45], [46], [47], [48], [49], [50], [51], [52]. In the first method, RNNs with multiple locally stable equilibrium points are directly regarded as associative memories [32], [33], [34], [35], [48], [49], [50], [51], [52]; in this case, the addressable memories or patterns are stored as stable equilibria or periodic orbits. The second method is to design an RNN with a unique equilibrium point and ensure that every trajectory globally converges to it, where the equilibrium depends on the external input rather than the initial state [45], [46], [47]. To the best of our knowledge, RNNs with discontinuous activation functions have not been considered as associative memories. Motivated by the above discussion, we investigate a class of recurrent neural networks with a special discontinuous activation function which is piecewise constant in the state space. One sufficient condition is obtained to ensure that the novel neural networks can have (4k − 1)^n locally exponentially stable equilibrium points. A synthesis procedure for associative memories is then presented by modifying the method described in [42].

The rest of this paper is organized as follows. In Section 2, stability analysis for associative memories is presented. The design procedure of the neural network for associative memories is given in Section 3. Simulation results of two examples are shown in Section 4. Finally, concluding remarks are given in Section 5.

Stability analysis for associative memories

In this section, stability analysis of the RNN with discontinuous activation function is presented. By dividing the state space into (4k − 1)^n subspaces, one sufficient condition is obtained to guarantee that there exists one equilibrium point in each subspace.

Synthesis of neural network

In this section, we present a design procedure for system (1) by modifying the synthesis procedure developed in [39].
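The full synthesis procedure is not reproduced in this snippet; the abstract states it rests on singular value decomposition. A generic pseudoinverse-based weight design, as used in classical associative-memory synthesis, can be sketched under that assumption (the matrices `Y` and `W` are illustrative, not the paper's notation):

```python
import numpy as np

# Hypothetical sketch: store pattern vectors as the columns of Y and choose a
# weight matrix W with W @ y = y for every stored pattern y, via the
# Moore-Penrose pseudoinverse (computed internally through an SVD).
Y = np.array([[1.0, -1.0, 3.0],
              [3.0,  1.0, -1.0]])  # three illustrative patterns in R^2

W = Y @ np.linalg.pinv(Y)  # projector onto the span of the stored patterns

# Each stored pattern is reproduced exactly by W:
print(np.allclose(W @ Y, Y))  # True
```

Any pattern in the span of the stored columns is a fixed point of the linear map `W`; the paper's procedure additionally places the memory patterns at stable equilibria of the nonlinear dynamics.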

For a given positive integer k, let

M = {3 − 4k, 5 − 4k, …, −3, −1, 0, 1, 3, …, 4k − 5, 4k − 3}

and

M^n = {ξ ∈ R^n : ξ = (ξ1, ξ2, …, ξn)^T, ξi ∈ M, i = 1, 2, …, n}.

Hence, M and M^n are composed of 4k − 1 and (4k − 1)^n elements, respectively.
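The sets M and M^n above can be enumerated directly, which also confirms the stated cardinalities (the helper names are illustrative):

```python
from itertools import product

def build_M(k: int) -> list:
    """M = {3-4k, 5-4k, ..., -3, -1, 0, 1, 3, ..., 4k-5, 4k-3}:
    the odd integers between 3-4k and 4k-3, together with 0."""
    odds = list(range(3 - 4 * k, 4 * k - 2, 2))  # 3-4k, 5-4k, ..., 4k-3
    return sorted(odds + [0])

def build_Mn(k: int, n: int) -> list:
    """M^n: all n-tuples with entries drawn from M."""
    return list(product(build_M(k), repeat=n))

M = build_M(2)               # k = 2
print(M)                     # [-5, -3, -1, 0, 1, 3, 5]
print(len(M))                # 4k - 1 = 7
print(len(build_Mn(2, 2)))   # (4k - 1)**n = 49
```

Each element of M^n then serves as a candidate memory pattern, one per subspace of the state-space partition.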

Illustrative examples

In this section, two examples are given for illustrating the obtained results and the synthesis procedure, respectively.

Consider the following RNN with discontinuous activation function:

ẋ1(t) = −x1(t) + 0.7 f(x1(t)) + 0.5 f(x2(t)) + 0.5 f(x1(t − 0.3)) − 0.51 f(x2(t − 0.3)) + 0.1,
ẋ2(t) = −x2(t) − 0.62 f(x1(t)) + 0.6 f(x2(t)) + 0.6 f(x1(t − 0.1)) + 0.7 f(x2(t − 0.1)) − 0.2.

Example 1 is almost the same as the example in [20], obtained by replacing the piecewise linear function

f(z) = 5, z ∈ [5, +∞); 2z − 5, z ∈ [3, 5); 1, z ∈ [1, 3); z, z ∈ (−1, 1); −1, z ∈ (−3, −1]; 2z
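A forward-Euler sketch of this delayed system is given below, assuming the minus signs dropped by extraction read as shown above, completing the last branches of f by odd symmetry, and using the piecewise linear f from [20] (the paper's Example 1 uses its discontinuous counterpart); step size, horizon, and the constant initial history are illustrative choices:

```python
# Forward-Euler simulation sketch of the delayed two-neuron RNN above.

def f(z: float) -> float:
    # Piecewise linear activation from [20]; the branches below -3 are
    # an assumption, completed by odd symmetry f(-z) = -f(z).
    if z >= 5:   return 5.0
    if z >= 3:   return 2 * z - 5
    if z >= 1:   return 1.0
    if z > -1:   return z
    if z > -3:   return -1.0
    if z > -5:   return 2 * z + 5
    return -5.0

h = 0.01                  # illustrative step size
steps = 2000              # 20 time units
d1 = int(0.3 / h)         # delay 0.3 in the x1 equation
d2 = int(0.1 / h)         # delay 0.1 in the x2 equation

# constant initial history on [-0.3, 0] (illustrative)
x1 = [2.0] * (d1 + 1)
x2 = [2.0] * (d1 + 1)

for t in range(d1, d1 + steps):
    dx1 = (-x1[t] + 0.7 * f(x1[t]) + 0.5 * f(x2[t])
           + 0.5 * f(x1[t - d1]) - 0.51 * f(x2[t - d1]) + 0.1)
    dx2 = (-x2[t] - 0.62 * f(x1[t]) + 0.6 * f(x2[t])
           + 0.6 * f(x1[t - d2]) + 0.7 * f(x2[t - d2]) - 0.2)
    x1.append(x1[t] + h * dx1)
    x2.append(x2[t] + h * dx2)

print(x1[-1], x2[-1])  # trajectory endpoint after 20 time units
```

Since f is bounded by 5 and each equation has a leaky −x term, every trajectory stays bounded; running the sketch from different initial histories illustrates convergence to different locally stable equilibria.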

Concluding remarks

In the present paper, we investigated a class of RNNs with discontinuous activation functions. It has been shown that such an n-dimensional RNN can have (4k − 1)^n locally exponentially stable isolated equilibrium points. The number of locally exponentially stable equilibrium points is larger than that of existing neural network models. Multistability in recurrent neural networks is a crucial issue when they are used as associative memories. Increasing storage capacity is a fundamental problem, hence,

Acknowledgments

The authors would like to thank the associate editor and the referees for their detailed comments and valuable suggestions, which considerably improved the presentation of the paper. The work is supported by the Natural Science Foundation of China under Grant 60974021, the 973 Program of China under Grant 2011CB710606, the Fund for Distinguished Young Scholars of Hubei Province under Grant 2010CDA081, the Specialized Research Fund for the Doctoral Program of Higher Education of China under Grant

Gang Bao received the B.S. degree in mathematics from Hubei Normal University, Huangshi, China, and the M.S. degree in applied mathematics from Beijing University of Technology, Beijing, China, in 2000 and 2004, respectively. He is currently working toward the Ph.D. degree in the Department of Control Science and Engineering, Huazhong University of Science and Technology, and the Image Processing and Intelligent Control Key Laboratory of the Ministry of Education of China, Wuhan, Hubei 430074, China.

References (54)

  • L.P. Li et al., Global asymptotic stability of delayed neural networks with discontinuous neuron activations, Neurocomputing (2009)
  • J. Park et al., A synthesis procedure for associative memories based on space-varying cellular neural networks, Neural Networks (2001)
  • A.C.B. Delbem et al., Design of associative memories using cellular neural networks, Neurocomputing (2009)
  • L.O. Chua et al., Cellular neural networks: theory, IEEE Trans. Circuits Syst. (1988)
  • L.L. Wang et al., Multistability and new attraction basins of almost-periodic solutions of delayed neural networks, IEEE Trans. Neural Networks (2009)
  • Z. Yi et al., Multistability analysis for recurrent neural networks with unsaturating piecewise linear transfer functions, Neural Comput. (2003)
  • Z.G. Zeng et al., Stability analysis of delayed cellular neural networks described using cloning templates, IEEE Trans. Circuits Syst. I (2004)
  • H. Zhang et al., Global asymptotic stability of recurrent neural networks with multiple time varying delays, IEEE Trans. Neural Networks (2008)
  • H. Zhang et al., Robust stability analysis for interval Cohen–Grossberg neural networks with unknown time varying delays, IEEE Trans. Neural Networks (2008)
  • H. Zhang et al., Global asymptotic stability and robust stability of a general class of Cohen–Grossberg neural networks with mixed delays, IEEE Trans. Circuits Syst. I (2009)
  • L. Zhang et al., Activity invariant sets and exponentially stable attractors of linear threshold discrete-time recurrent neural networks, IEEE Trans. Autom. Control (2008)
  • L. Zou et al., Nontrivial global attractors in 2-D multistable attractor neural networks, IEEE Trans. Neural Networks (2009)
  • K. Yokosawa et al., Cellular neural networks with output function having multiple constant regions, IEEE Trans. Circuits Syst. I (2003)
  • Z.G. Zeng et al., Multiperiodicity and exponential attractivity evoked by periodic external inputs in delayed cellular neural networks, Neural Comput. (2006)
  • L. Zhang et al., Multiperiodicity and attractivity of delayed recurrent neural networks with unsaturating piecewise linear transfer functions, IEEE Trans. Neural Networks (2008)
  • M. González et al., Block attractor in spatially organized neural networks, Neurocomputing (2009)
  • C.Y. Cheng et al., Multistability in recurrent neural networks, SIAM J. Appl. Math. (2006)
    Zhigang Zeng received the B.S. degree in mathematics from Hubei Normal University, Huangshi, China, in 1993, the M.S. degree in ecological mathematics from Hubei University, Wuhan, China, in 1996, and the Ph.D. degree in systems analysis and integration from Huazhong University of Science and Technology, Wuhan, China, in 2003.

    Currently, he is a Professor at the Department of Control Science and Engineering, Huazhong University of Science and Technology. His research interests include neural networks, switched systems, computational intelligence, stability analysis of dynamical systems, pattern recognition, and associative memories.
