Biosystems, Volume 89, Issues 1–3, May–June 2007, Pages 257–263

Pattern storage in a sparsely coded neural network with cyclic activation

https://doi.org/10.1016/j.biosystems.2006.04.023

Abstract

We investigate an artificial neural network model with a modified Hebb rule. It is an auto-associative neural network similar to the Hopfield and Willshaw models, and it shares properties of both. In addition, the patterns are sparsely coded and are stored in cycles of synchronous neural activity. For some parameter ranges, these cycles of activity increase the capacity of the model. We discuss the basic properties of the model and some implementation issues, namely the optimization of the algorithms. We describe the modification of the Hebb learning rule, the learning algorithm, the generation of patterns, the decomposition of patterns into cycles, and pattern recall.

Introduction

In his famous book, Donald Hebb suggested a mechanism describing how neurons should interact in order to store information (Hebb, 1949; Sejnowski, 1999). Basically, concurrent activity in space and time in both the pre- and post-synaptic neuron is required to strengthen the synapse. The modification of synapses is the mechanism underlying memory storage and retrieval in both artificial and biological neural networks (Rolls and Treves, 1998).
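As a concrete illustration of this co-activity requirement, the following minimal sketch applies a plain Hebbian outer-product update to a binary activity pattern; the function and variable names are our own illustrative choices, not part of the model described in this paper.

```python
import numpy as np

def hebbian_update(W, pattern):
    """Strengthen w_ij whenever neurons i and j are active together in the
    presented pattern (plain Hebbian outer-product rule)."""
    x = np.asarray(pattern, dtype=float)
    W += np.outer(x, x)              # co-active pairs are reinforced
    return W

# Neurons 0 and 1 fire together, so only their mutual weights grow.
W = np.zeros((4, 4))
hebbian_update(W, [1, 1, 0, 0])
print(W[0, 1], W[0, 2])              # -> 1.0 0.0
```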

The original visionary Hebb formulation is general enough to admit many variants of how the pre- and post-synaptic activity are linked. In the case of a feedback network, the Hebb rule links the forward activity with the backward, feedback activity. Here the timing within the Hebb rule is especially crucial, because the feedback strength and the feedback time constant, which determines the delay of the feedback, are not independent within the feedback dynamics. One of the first artificial networks using the Hebb rule is now known as the Hopfield model. In Hopfield's original words, the timing issue of the Hebb rule in his model required “some appropriate calculation over past history”, and indeed in Hopfield (1982) the Hebb rule and the other definitions ensure that the network converges through the feedback to a well-defined optimum energy state. This original Hopfield model has limitations, however. One particular limitation is the requirement that the overall activity of the input pattern lie around 50% to assure the best network functioning. This would require pattern preprocessing when the Hopfield model is used as an artificial neural network to store patterns that differ substantially from this 50% activity level. Moreover, such high activity is not typical of biological cortical networks, which matters when we want to discuss the Hopfield network as a model of distributed memory within the context of neuroscience. Possible ways to work around these limitations are based on modifying the Hebb rule. This explains why we believe there is still a need for studying new variants of the Hebb learning rule in new versions of artificial neural networks.

New learning rules and new implementation solutions might eventually bring new concepts for reconnecting the disparate areas of artificial and biological networks. We do not attempt here to pursue the ambitious goal of finding a direct application of an artificial network to describe a biological one. Instead, we propose one particular implementation of the classical Hebb rule and present several interesting results obtained during the implementation of this new rule.

Willshaw et al. (1969) proposed one of the first associative neural network models. The Hebb rule was applied in an artificial neural network in the discrete and continuous versions of the Hopfield network, described in Hopfield (1982) and Hopfield (1984), respectively. A network with sparse activity was studied by Amit et al. (1987).


Model definition

The model consists of n formal neurons. Each neuron is connected with all the others, and self-connections are also present. The output of a neuron is expressed as

x_j = H\left( \sum_{i=1}^{n} w_{ij} x_i + \vartheta \right),

where x_j is the output of the jth neuron, w_{ij} is the weight of the synaptic connection from the ith neuron to the jth neuron, and \vartheta is the threshold, which is the same for all neurons. The function H is a hard-limiter activation function, where

H(x) = \begin{cases} 1 & \text{for } x > 0, \\ 0 & \text{otherwise}. \end{cases}

The synaptic weights and also the outputs of neurons attain
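For readability, here is a minimal sketch of one synchronous update step implied by the equations above; the toy binary weights, the negative threshold value, and all names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hard_limiter(v):
    """H(x) = 1 for x > 0, 0 otherwise."""
    return (v > 0).astype(int)

def synchronous_step(W, x, theta):
    """One synchronous update of all neurons: x_j <- H(sum_i w_ij x_i + theta).

    W[i, j] is the weight from neuron i to neuron j, x is the current binary
    output vector, and theta is the common threshold."""
    return hard_limiter(W.T @ x + theta)

# Toy network of three fully connected neurons (self-connections allowed).
W = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
x = np.array([1, 0, 0])
print(synchronous_step(W, x, theta=-0.5))    # -> [0 1 0]
```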

Network capacity

A natural question concerning a neural network model is its capacity. We present here an analysis which leads to an estimate of the capacity. Afterwards, we provide computer simulations which confirm our estimates.

Assume first that the relative sub-pattern activity in the network is a constant small number near 0 and that the number of neurons approaches infinity. We suppose that the network operates well until the critical filling of the weight matrix is reached by the presented
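As a rough, hedged sketch of how such a filling estimate can be computed — assuming a Willshaw-style clipped binary Hebb rule and independent random sparse patterns, which may differ from the exact setting analysed in the paper — the expected fraction of set weights after storing M patterns can be evaluated as follows.

```python
def expected_filling(n, k, M):
    """Expected fraction of weights equal to 1 after storing M random binary
    patterns with k active units out of n, under a clipped Hebb rule
    (w_ij is set once neurons i and j have ever been co-active).
    Illustrative assumption only; the paper's analysis may differ."""
    p_coactive = (k / n) ** 2                 # a given pair is co-active in one pattern
    return 1.0 - (1.0 - p_coactive) ** M      # set at least once over M patterns

# At 1% activity in a 1000-neuron network, half the matrix fills near M ~ 7000.
n, k = 1000, 10
for M in (1000, 5000, 7000, 10000):
    print(M, round(expected_filling(n, k, M), 3))
```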

Implementation and algorithms

The pattern generation and testing processes are relatively fast. The time complexity of the network learning and of the phases of network dynamics depends on the square of the number of neurons, so these phases consume most of the computation time.

The phase of network dynamics cannot be optimized much, because we have to go through all the values in the corresponding equations. We cannot skip the evaluation of some parts, nor can we assume that they have some predictable value. Unlike these phases, the
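The asymmetry between the two phases can be sketched as follows: one step of the retrieval dynamics must visit all n² weights, whereas a clipped Hebbian learning step for a sparse pattern only needs to touch the weights between its k active units. The O(k²) learning shortcut, the function names, and the toy parameters below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def learn_sparse(W, active):
    """Clipped Hebbian learning restricted to a sparse pattern: only the
    k*k weights between co-active neurons are touched, costing O(k^2)
    per pattern instead of O(n^2)."""
    idx = np.asarray(active)
    W[np.ix_(idx, idx)] = 1
    return W

def recall_step(W, x, theta):
    """One step of the retrieval dynamics: the full matrix-vector product
    visits every weight, so this phase stays O(n^2) per step."""
    return ((W.T @ x + theta) > 0).astype(int)

n = 1000
W = np.zeros((n, n))
learn_sparse(W, active=[3, 17, 256])       # touches only 3 x 3 = 9 weights
x = np.zeros(n)
x[[3, 17]] = 1                             # partial cue
x = recall_step(W, x, theta=-1.5)          # full O(n^2) sweep completes the pattern
print(np.flatnonzero(x))                   # -> [  3  17 256]
```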

Discussion

The Hopfield network is one of the simplest adaptations of the Hebb rule in artificial networks. In biological networks with a small number of cells, some variants of the Hebb rule have also been demonstrated on several temporal and spatial scales (Gerstner et al., 1996). Most artificial neural networks using the Hebb rule operate in regimes with high average activity. In contrast, the average activities observed in biological networks are low. Our work was motivated by a search for a set of

Conclusion

We have presented a model of a neural network with a modified Hebb rule and an optimized learning algorithm. The decomposition of patterns into cycles brings higher capacity and speeds up the learning algorithm. The higher capacity and speed gains might make the model suitable for real-time applications. Attention must be paid when implementing the recall process, which can be realized as inhibitory dynamics, as cyclic encoding, or as another algorithm yet to be devised.

Acknowledgments

Thanks to Nick Dorrell. Supported by the GAUK grant no. 32/2005 (CU/1st Med. Fac. reg. no. 203212) and by the Czech Ministry of Education through the Research initiative no. MSM 6840770012.

References (17)

  • T.J. Sejnowski. The book of Hebb. Neuron (1999)
  • D.J. Amit et al. Information storage in neural networks with low level of activity. Phys. Rev. A (1987)
  • J.E. Bresenham. Algorithm for computer control of a digital plotter. IBM Syst. J. (1965)
  • T.H. Brown et al. Hippocampus
  • R. Croft et al. The relative contributions of ecstasy and cannabis to cognitive impairment. Psychopharmacology (Berlin) (2001)
  • W. Gerstner et al. A neuronal learning rule for sub-millisecond temporal coding. Nature (1996)
  • D. Golomb et al. Willshaw model: associative memory with sparse coding and low firing rates. Phys. Rev. A (1990)
  • D.O. Hebb. The Organization of Behavior (1949)
