Elsevier

Neural Networks

Volume 24, Issue 2, March 2011, Pages 143-147

Neural networks letter
Tree-like hierarchical associative memory structures

https://doi.org/10.1016/j.neunet.2010.09.012

Abstract

In this letter we explore an alternative structural representation for Steinbuch-type binary associative memories. These networks offer very generous storage capacities (both asymptotic and finite) at the expense of sparse coding. However, the original retrieval prescription performs a complete search on a fully-connected network, whereas, due to the sparse coding requirement, only a small fraction of units will eventually contain the desired results. Instead of modelling the network as a single layer of neurons, we suggest a hierarchical organization where the information content of each memory is a successive approximation of the next. With such a structure it is possible to enhance retrieval performance using a progressively deepening procedure. To back up our intuition we provide collected experimental evidence alongside comments on possible biological plausibility.

Introduction

Recent advances in the study of binary Steinbuch-type associative memories (Steinbuch, 1961) have shed light on the importance of their structural representation concerning achievable network performance (Knoblauch, Palm, & Sommer, 2010). For instance, as a means to enhance storage capacity, an alternative representation scheme based on Huffman or Golomb coding has been proposed (Knoblauch, 2003). Instead of encoding synaptic connectivity through its immediate weight matrix form, it has been shown that if a compressed variant is used, the classic asymptotic upper bound on storage capacity may be increased.

In this letter, we consider a novel hierarchical approach to associative memory structural representation with positive implications on retrieval performance. Furthermore, this procedure establishes possible links to current neurobiology theories, complying with the low energy consumption requirements of the brain (Laughlin, 2001, Lennie, 2003).

Whereas in conventional Steinbuch-type memories information is stored within a single neural network, in our method it is spread across an ensemble of hierarchically arranged networks of increasing resolution. These additional networks are successively approximated (or ‘compressed’) versions of each other and allow for early selective filtering of relevant neurons, pruning unnecessary units from the query process while progressively reaching a final result.

Hierarchical memories are already a well-known concept in neural network architecture theory but have been employed for different purposes. Such structures have been used by a line of research (Hirahara et al., 2000, Kakeya and Kindo, 1996, Štanclová and Zavoral, 2005) to store naturally correlated patterns, which would otherwise quickly saturate the network and introduce intolerable errors in the retrieval process.

Section snippets

Binary associative memories

Steinbuch-type associative memories have been the object of exhaustive analyses since their inception in the early sixties. Palm (1980) and Willshaw, Buneman, and Longuet-Higgins (1969) provided the missing rigorous mathematical description of the model and the first formal studies of its performance. Instead of the original electrical circuit formulation envisioned by Steinbuch, they presented the model as a single-layer feedforward network of binary threshold neurons.

These memories establish a
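The classic learning and one-step retrieval rules of this model can be sketched as follows. This is our illustration, not the authors' code; the function names `store` and `retrieve`, and the default Willshaw threshold equal to the input activity |x̃|_0, are our own choices.

```python
import numpy as np

def store(pairs, m, n):
    """Learn binary pattern pairs (x, y) into an m x n binary weight matrix.

    Clipped (binary) Hebbian rule: a synapse is set the first time any
    stored pair co-activates it, and is never unset afterwards.
    """
    W = np.zeros((m, n), dtype=bool)
    for x, y in pairs:
        W |= np.outer(y, x).astype(bool)
    return W

def retrieve(W, x_query, theta=None):
    """One-step retrieval with a global firing threshold theta.

    With a noise-free cue, theta = |x_query|_0 (the Willshaw threshold)
    guarantees that every stored output unit reaches the threshold.
    """
    if theta is None:
        theta = int(x_query.sum())
    # Dendritic potential: count active synapses per output unit.
    dendritic = W[:, x_query.astype(bool)].sum(axis=1)
    return (dendritic >= theta).astype(np.uint8)
```

At low memory loads this retrieval is exact; as more associations are superposed in W, spurious output units may also reach the threshold.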

On the computational complexity of retrieval

Steinbuch memories naturally benefit from specialized massively parallel hardware implementations (see for instance Palm and Palm (1991)), as each neuron may independently perform both learning and retrieval, given that the computation is synchronous. In such a computer equipped with n processors, one for each neuron, the retrieval process will be proportional in time to the number of ‘1’ elements of the input pattern x̃. As the activity level |x̃|0 is close to |x|0, which is of logarithmic
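The cost argument can be made concrete with a sequential sketch in which dendritic potentials are accumulated one active input column at a time; the function `retrieve_sparse` and its operation counter `ops` are our invention for illustration, assuming a precomputed binary weight matrix W.

```python
import numpy as np

def retrieve_sparse(W, active_inputs, theta):
    """Retrieve by visiting only the columns indexed by the '1' bits of
    the query; the work done is proportional to |x~|_0, not to n."""
    dendritic = np.zeros(W.shape[0], dtype=int)
    ops = 0
    for j in active_inputs:      # one column pass per active input bit
        dendritic += W[:, j]
        ops += 1
    return (dendritic >= theta).astype(np.uint8), ops
```

Since the sparse coding requirement keeps |x̃|_0 small relative to n, the column-wise loop touches only a logarithmic-order fraction of the input dimension.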

Tree-like hierarchical memories

By inspection of the Hierarchical Subspace Tree due to Wichert (2009) we were led to an interesting question: could the properties of a tree-like structure be applied to binary associative memories in order to improve retrieval performance and minimize energy consumption?

In pursuit of this premise we conceived a hierarchical structural representation suitable for the general associative memory task. The simple intuition behind our method can be grasped through the analysis of Eq. (4). Whenever
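One way the progressive-deepening idea could be realized is sketched below. This is our hypothetical reading with invented details (OR-compression of groups of g output rows, the names `compress_rows` and `hierarchical_retrieve`), not necessarily the authors' exact construction. The key property is that a coarse unit's dendritic sum upper-bounds those of the fine units it covers, so pruning at the coarse level can never discard a unit that would fire at full resolution.

```python
import numpy as np

def compress_rows(W, g):
    """OR-merge every g consecutive output rows into one coarse unit,
    producing an approximated ('compressed') version of the memory."""
    m, n = W.shape
    return W.reshape(m // g, g, n).any(axis=1)

def hierarchical_retrieve(W, x, g, theta):
    """Two-level progressive deepening: query the coarse memory first,
    then evaluate fine units only inside surviving coarse groups."""
    coarse = compress_rows(W, g)
    active = x.astype(bool)
    # Coarse filtering: groups whose OR-ed row cannot reach the
    # threshold are pruned from the remainder of the query.
    survivors = np.flatnonzero(coarse[:, active].sum(axis=1) >= theta)
    out = np.zeros(W.shape[0], dtype=np.uint8)
    for grp in survivors:        # deepen only into surviving groups
        rows = slice(grp * g, (grp + 1) * g)
        dendritic = W[rows][:, active].sum(axis=1)
        out[rows] = (dendritic >= theta).astype(np.uint8)
    return out
```

Because the pruning is conservative, the result coincides with a flat retrieval over the full matrix, while only a fraction of the output units are ever evaluated at full resolution.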

Experimental results

Through a series of numerical simulations on a sequential computer we have measured the effective retrieval costs associated with varying hierarchy dispositions. The experiments have been conducted on a mid-sized square associative memory with m=n=2000 neurons. Without loss of generality, the memory performed an auto-associative task; similar results should be observed for hetero-association at equivalent capacity loads (i.e., with a correspondingly higher number of stored associations M).
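A toy version of such an auto-associative setup can be assembled as below; it is far smaller than the paper's m=n=2000 memory, and every parameter value here is our own choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, M = 200, 5, 20                 # units, pattern activity, stored patterns

# Sample M sparse binary patterns with exactly k active units each.
patterns = np.zeros((M, n), dtype=bool)
for p in patterns:
    p[rng.choice(n, k, replace=False)] = True

# Auto-association: each pattern is stored against itself, so the
# clipped Hebbian superposition uses identical pre- and post-patterns.
W = np.zeros((n, n), dtype=bool)
for p in patterns:
    W |= np.outer(p, p)

# Cue with each full pattern at the Willshaw threshold k. Every stored
# unit reaches the threshold, so recall can add spurious units at high
# loads but can never miss a stored one.
recalled = [W[:, p].sum(axis=1) >= k for p in patterns]
```

At this light load the recalled sets essentially coincide with the stored patterns; pushing M toward the capacity limit makes spurious units appear, which is the regime where early coarse-level filtering pays off.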

As Fig. 2 highlights,

Conclusions

Nearest neighbour determination for binary patterns is a well-known problem for which numerous approaches exist—take for instance locality-sensitive hashing (Andoni, Datar, Immorlica, Indyk, & Mirrokni, 2006) as a recent method developed within the computer science community. Neural associative memories are biologically inspired solutions for this problem.

We have chosen to apply our hierarchical representation scheme to the classic ‘Steinbuch-Willshaw-Palm’ model due to its amenability to analysis,

Acknowledgements

The authors wish to express their gratitude towards Jan Cederquist and the anonymous reviewers for their helpful and incisive comments on early versions of this manuscript. The authors are also indebted to Alexander Pattenden for proofreading and suggesting useful corrections to the language. This work was supported by Fundação para a Ciência e Tecnologia (INESC-ID multiannual funding) through the PIDDAC Program funds and through an individual doctoral grant awarded to the first author.

References (26)

  • A. Andoni et al. Locality-sensitive hashing scheme based on p-stable distributions

  • B. Graham et al. Capacity and information efficiency of a brain-like associative net

  • B. Graham et al. Improving recall from an associative memory. Biological Cybernetics (1995)