Elsevier

Neural Networks

Volume 11, Issue 5, July 1998, Pages 785-792

Neural networks letter
A combined evolution method for associative memory networks

https://doi.org/10.1016/S0893-6080(98)00041-0

Abstract

In the study of associative memory networks, the updating rule remains unchanged during neuron evolution. When evolution stops at an undesired minimum, simulated annealing is commonly used to resolve the problem. In this paper, a combined neuron evolution method based on a multipath network architecture is presented. It is shown that, by controlling the evolution path, the probability that evolution terminates in undesired minima is significantly reduced. Once evolution is trapped in an undesired minimum with respect to one path, the method seeks an alternative path to carry on the evolution. The process continues until a desired minimum is reached. Visual examples are used to demonstrate the performance of the method.

Introduction

Associative memory networks hold an important place in the history of neurocomputing. One example is the Hopfield network, which contributed significantly to modern neural network research (Hopfield, 1982). Associative memory networks have several desirable properties: (a) their recurrent structure leads to fast learning (Hecht-Nielsen, 1989); (b) they have strong self-organizing features, which give them the potential to resolve ill-conditioned problems; (c) their structure is simple, so they can potentially be implemented at very large scale in inexpensive hardware. Associative memory networks have found numerous applications in optimization (Hopfield and Tank, 1985), pattern recognition (Nasrabadi and Li, 1992), and image processing (Paik and Katsaggelos, 1992). One particularly attractive feature of associative networks is their ability to recover desired features from distorted patterns, which makes them good candidates for information retrieval in multimedia systems.

Associative memory networks suffer from two major problems: limited storage capacity and undesired minima. Attention has been paid to the resolution of the capacity problem in the past (Hains and Hecht-Nielsen, 1988; McEliece, 1977; Telfer and Casasent, 1990; Geva and Sitte, 1991). Solutions to the undesired minima problem were normally obtained as by-products of solutions to the capacity problem. One scheme to alleviate the undesired minima problem is simulated annealing (Kirkpatrick et al., 1983). Simulated annealing employs the idea that, by adding 'noise' to the current state, the evolution may 'jump' out of an undesired minimum and move down to a desired one.
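The 'noise' mechanism described above can be illustrated with a minimal sketch of annealed evolution in a bipolar Hopfield-style network with one Hebbian-stored pattern. All names, the network size, and the cooling schedule here are illustrative choices, not the formulation used in the paper:

```python
import numpy as np

def energy(W, x):
    # Energy of bipolar state x; the quadratic form is negated here so
    # that a Hebbian-stored pattern sits at an energy minimum.
    return -x @ W @ x

def anneal_step(W, x, T, rng):
    # Flip one randomly chosen neuron; accept the flip if it lowers the
    # energy, otherwise accept with probability exp(-dE / T). The random
    # acceptance is the 'noise' that lets evolution escape a minimum.
    i = rng.integers(len(x))
    trial = x.copy()
    trial[i] = -trial[i]
    dE = energy(W, trial) - energy(W, x)
    if dE < 0 or rng.random() < np.exp(-dE / T):
        return trial
    return x

rng = np.random.default_rng(0)
pattern = np.array([1, -1, 1, -1], dtype=float)
W = np.outer(pattern, pattern)             # one stored pattern (Hebbian)
np.fill_diagonal(W, 0)

x = np.array([1, 1, 1, -1], dtype=float)   # distorted probe of the pattern
T = 2.0
for _ in range(200):
    x = anneal_step(W, x, T, rng)
    T *= 0.98                              # geometric cooling schedule
```

As the temperature T decreases, uphill moves become rare and the state settles into a low-energy attractor.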

Simulated annealing works well when the desired minima of the energy function have exactly the same energy level. However, the desired minima produced by a learning rule may have different energy levels. This may cause problems in practical applications, such as information retrieval, when associative memory networks are used to construct databases in multimedia systems. If the network is trained such that the desired minima corresponding to the stored information patterns have different energy levels, simulated annealing may treat the desired minima with higher energy levels as undesired minima, and the corresponding patterns will never be retrieved. Further, if some of the undesired minima happen to have lower energy levels, evolution based on simulated annealing will terminate in one of the undesired minima.

This paper introduces a combined evolution method based on a multipath architecture in associative memory networks. Multiple connections are used in the network for training and evolution. In training, each learning rule is utilized to train one particular path of the connections. The motivation behind this approach is that different learning rules produce the same desired minima, but the probability they produce the same undesired minima is extremely small. This argument is formally verified, and justified by experimental results.
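The multipath idea above can be sketched as follows: each path is a connection matrix trained by a different rule, and evolution switches to the next path when the current one settles outside the stored set. This is a simplified illustration under assumed standard formulations of the Hebbian and OLAM rules, not the paper's exact algorithm:

```python
import numpy as np

def evolve(W, x, steps=50):
    # Synchronous sign-threshold update, stopping at a fixed point.
    for _ in range(steps):
        nxt = np.sign(W @ x)
        nxt[nxt == 0] = 1.0
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

def combined_evolution(paths, x, stored):
    # Evolve along each connection-matrix 'path' in turn, stopping as
    # soon as the fixed point matches a stored pattern (desired minimum).
    for W in paths:
        x = evolve(W, x)
        if any(np.array_equal(x, p) for p in stored):
            break
    return x

P = np.array([[1,  1, 1, -1, -1, -1],
              [1, -1, 1, -1,  1, -1]], dtype=float)  # stored patterns
W_hebb = P.T @ P                       # path 1: Hebbian outer products
np.fill_diagonal(W_hebb, 0)
W_olam = P.T @ np.linalg.pinv(P.T)     # path 2: OLAM (pseudoinverse)

probe = np.array([1, 1, 1, -1, -1, 1], dtype=float)  # distorted P[0]
out = combined_evolution([W_hebb, W_olam], probe, list(P))
```

Because the two rules share their desired minima, either path can complete the recall; a spurious fixed point of one path is unlikely to also be a fixed point of the other.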

Section snippets

The proposed method

In this section, the multipath architecture is first presented. It is then shown, via probability analysis, that the undesired minima can largely be avoided by using the proposed architecture. Finally, an implementation algorithm is given.

In general, searching for a solution in an associative memory network is equivalent to minimizing the energy function E associated with a learning rule,

E = XᵀWX,

where X is the pattern space, and W is the connection matrix generated by the learning rule. As long as
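The minimization view can be made concrete by exhaustively evaluating the quadratic form over all bipolar states of a tiny network. Note the negated Hebbian matrix below is a sign convention chosen for this sketch so that the stored pattern sits at a minimum; it is not taken from the paper:

```python
import numpy as np
from itertools import product

# Exhaustively evaluate E = x^T W x over every bipolar state of a
# four-neuron network and locate the energy minima.
p = (1.0, -1.0, 1.0, -1.0)        # one stored pattern
W = -np.outer(p, p)               # negated Hebbian outer product
np.fill_diagonal(W, 0)

energies = {x: float(np.array(x) @ W @ np.array(x))
            for x in product((-1.0, 1.0), repeat=4)}
E_min = min(energies.values())
minima = [x for x, e in energies.items() if e == E_min]
```

In this toy landscape the only minima are the stored pattern and its complement; the evolution dynamics descend toward one of them.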

Experiments and discussions

The proposed approach was initially tested with two commonly used learning rules: the Hebbian rule (Hebb, 1949) in its outer-products version (Fausett, 1994), and the auto-associative optimal linear associative memory (OLAM) rule (Kohonen, 1988).
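For reference, the two rules can be written in a few lines each; this is a sketch of their standard textbook formulations under an assumed bipolar pattern encoding, not the paper's exact training procedure:

```python
import numpy as np

# Stored patterns are the rows of P (bipolar, mutually orthogonal here).
P = np.array([[1,  1, -1, -1],
              [1, -1,  1, -1]], dtype=float)

W_hebb = P.T @ P / P.shape[0]        # Hebbian: averaged outer products
W_olam = P.T @ np.linalg.pinv(P.T)   # OLAM: projection onto pattern span

# Both connection matrices leave every stored pattern invariant under
# the sign-threshold update, i.e. their desired minima coincide.
for p in P:
    assert np.array_equal(np.sign(W_hebb @ p), p)
    assert np.array_equal(np.sign(W_olam @ p), p)
```

The OLAM matrix is the orthogonal projection onto the span of the stored patterns, which is why it recalls them exactly even when they are merely linearly independent rather than orthogonal.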

Numerical examples were used to test the performance of the algorithm. In this section, the results of two experiments are illustrated. In the first example, the eight example patterns shown in Fig. 2(a) were used in the learning process.

Conclusions

This paper presents a combined neuron evolution algorithm based on a multipath architecture to resolve undesired minima in associative memory networks. Statistical analysis is provided to justify the feasibility of the proposed scheme. Analysis and experimental results indicate that the probability that evolution terminates in an undesired minimum is substantially reduced. Owing to its ability to resolve undesired minima, the proposed method provides a potential solution to pattern matching in

References (14)

  • Cheng, A. C. C., Guan, L. & Cheung, R. (1994). A novel learning framework for associative neural networks. In Proc....
  • Fausett, L. (1994). Fundamentals of Neural Networks. Englewood Cliffs, NJ:...
  • Geva, S. & Sitte, J. (1991). An exponential response neural net. Neural Computation.
  • Hains, K. & Hecht-Nielsen, R. (1988). A BAM with increased information storage capacity. In Proc. of the Int. Conf on...
  • Hebb, D. (1949). The Organization of Behaviour. New York:...
  • Hecht-Nielsen, R. (1989). Neurocomputing. New York:...
  • Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proc. Natl Acad. Sci.

Cited by (4)

  • A structural representing and learning model based on biological neural mechanism

    2004, Proceedings of 2004 International Conference on Machine Learning and Cybernetics