Neural Networks Letter
Tree-like hierarchical associative memory structures
Introduction
Recent advances in the study of binary Steinbuch-type associative memories (Steinbuch, 1961) have shed light on the importance of their structural representation for achievable network performance (Knoblauch, Palm, & Sommer, 2010). For instance, as a means to enhance storage capacity, an alternative representation scheme based on Huffman or Golomb coding has been proposed (Knoblauch, 2003). It has been shown that if synaptic connectivity is stored in a compressed variant of its immediate weight-matrix form, rather than in the matrix itself, the classic asymptotic upper bound on storage capacity may be increased.
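The compressed representation can be pictured with a minimal Golomb (Rice) coding sketch, assuming each weight-matrix row is stored as the gaps between consecutive '1' entries; the parameter `m` and the example row are illustrative, not the construction of Knoblauch (2003):

```python
def golomb_encode(gaps, m):
    """Encode non-negative integers (gaps between '1' bits of a sparse
    binary row) with Golomb parameter m. Returns a bit string."""
    bits = []
    for n in gaps:
        q, r = divmod(n, m)
        bits.append("1" * q + "0")            # quotient in unary
        b = m.bit_length() - 1                # truncated-binary remainder
        cutoff = (1 << (b + 1)) - m
        if r < cutoff:
            bits.append(format(r, f"0{b}b") if b else "")
        else:
            bits.append(format(r + cutoff, f"0{b + 1}b"))
    return "".join(bits)

# An illustrative sparse weight-matrix row and its gap representation.
row = [0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0]
ones = [i for i, w in enumerate(row) if w]
gaps = [ones[0]] + [b - a - 1 for a, b in zip(ones, ones[1:])]
code = golomb_encode(gaps, m=4)               # 13 bits vs. 16 raw bits
```

For sparse rows the coded form is shorter than the raw bit vector, which is the intuition behind trading a dense weight matrix for a compressed one.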
In this letter, we consider a novel hierarchical approach to associative memory structural representation with positive implications for retrieval performance. Furthermore, this procedure establishes possible links to current neurobiology theories, complying with the low energy consumption requirements of the brain (Laughlin, 2001; Lennie, 2003).
Whereas in conventional Steinbuch-type memories information is stored within a single neural network, in our method it is spread across an ensemble of hierarchically arranged networks of increasing resolution. These additional networks are successively approximated (or ‘compressed’) versions of each other and allow for early selective filtering of relevant neurons, pruning unnecessary units from the query process while progressively converging on a final result.
Hierarchical memories are already a well-known concept in neural network architecture theory but have been employed for different purposes. Such structures have been used by a line of research (Hirahara et al., 2000; Kakeya and Kindo, 1996; Štanclová and Zavoral, 2005) to store naturally correlated patterns, which would otherwise quickly saturate the network and introduce intolerable errors in the retrieval process.
Section snippets
Binary associative memories
Steinbuch-type associative memories have been the object of exhaustive analysis since their inception in the early 1960s. Palm (1980) and Willshaw, Buneman, and Longuet-Higgins (1969) provided the missing rigorous mathematical description of the model and the first formal studies of its performance. Instead of the original electrical circuit formulation envisioned by Steinbuch, they presented the model as a single-layer feedforward network of binary threshold neurons.
These memories establish a
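The single-layer model described above can be sketched as follows: binary clipped-Hebbian learning (the weight matrix is the OR of the outer products of the stored patterns) and one-step threshold retrieval. Pattern sizes and indices here are illustrative:

```python
import numpy as np

def train(patterns, n):
    """Clipped Hebbian learning: W is the OR of the outer products
    of all stored (auto-associative) binary patterns."""
    W = np.zeros((n, n), dtype=np.uint8)
    for x in patterns:
        W |= np.outer(x, x).astype(np.uint8)
    return W

def retrieve(W, x):
    """One-step retrieval: a neuron fires iff its dendritic potential
    reaches the number of active inputs in the cue."""
    s = W.astype(int) @ x            # dendritic potentials
    return (s >= x.sum()).astype(np.uint8)

n = 16
x1 = np.zeros(n, dtype=np.uint8); x1[[1, 5, 9]] = 1
x2 = np.zeros(n, dtype=np.uint8); x2[[2, 7, 11]] = 1
W = train([x1, x2], n)
assert (retrieve(W, x1) == x1).all()   # a stored pattern is a fixed point
```

With the threshold set to the cue's activity level, a partial cue (some '1' bits missing) still recovers the full stored pattern, which is the model's pattern-completion behaviour.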
On the computational complexity of retrieval
Steinbuch memories naturally benefit from specialized massively parallel hardware implementations (see for instance Palm and Palm (1991)), as each neuron may independently perform both learning and retrieval, provided that the computation is synchronous. In such a computer, equipped with one processor per neuron, the retrieval time will be proportional to the number of ‘1’ elements of the input pattern. As the activity level is close to , which is of logarithmic
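A toy accounting of this cost on a sequential machine, assuming retrieval visits only the columns of active input lines (the function and sizes are illustrative, not the paper's analysis):

```python
import numpy as np

def retrieval_ops(W, x):
    """Count the elementary synaptic operations of one retrieval step
    when only the columns of active ('1') input units are visited."""
    active = np.flatnonzero(x)
    s = np.zeros(W.shape[0], dtype=int)
    ops = 0
    for j in active:               # one pass per active input line
        s += W[:, j]
        ops += W.shape[0]
    return s, ops

n = 1024
x = np.zeros(n, dtype=np.uint8)
x[:10] = 1                         # a sparse cue with 10 active units
W = np.random.default_rng(0).integers(0, 2, (n, n), dtype=np.uint8)
s, ops = retrieval_ops(W, x)
assert ops == 10 * n               # cost scales with |x|, not with n*n
```

Because the cue is sparse (its activity level grows only logarithmically with the network size), this column-wise accounting is far cheaper than a full matrix–vector product.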
Tree-like hierarchical memories
By inspection of the Hierarchical Subspace Tree due to Wichert (2009) we were led to an interesting question: could the properties of a tree-like structure be applied to binary associative memories in order to improve retrieval performance and minimize energy consumption?
In pursuit of this premise we conceived a hierarchical structural representation suitable for the general associative memory task. The simple intuition behind our method can be grasped through the analysis of Eq. (4). Whenever
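The paper's exact construction is not reproduced here; as a rough sketch of the filtering idea, assume a two-level hierarchy whose coarse level OR-pools blocks of g fine units. A block's coarse potential then upper-bounds every fine potential inside it, so pruning blocks below threshold discards no true responses. All names and sizes below are illustrative:

```python
import numpy as np

def coarse_matrix(W, g):
    """Compress W by OR-pooling rows in blocks of g: coarse unit k is
    connected to input j iff some fine unit in its block is."""
    n = W.shape[0]
    return np.array([W[k * g:(k + 1) * g].max(axis=0) for k in range(n // g)])

def hierarchical_retrieve(W, Wc, g, x):
    theta = x.sum()
    # Level 1: cheap pass over the coarse memory prunes whole blocks.
    survivors = np.flatnonzero(Wc.astype(int) @ x >= theta)
    # Level 2: evaluate only the fine units inside surviving blocks.
    y = np.zeros(W.shape[0], dtype=np.uint8)
    for k in survivors:
        rows = slice(k * g, (k + 1) * g)
        y[rows] = (W[rows].astype(int) @ x >= theta)
    return y

n, g = 16, 4
pats = []
for idx in ([1, 5, 9], [2, 7, 11]):
    p = np.zeros(n, dtype=np.uint8); p[idx] = 1; pats.append(p)
W = np.zeros((n, n), dtype=np.uint8)
for p in pats:
    W |= np.outer(p, p)
Wc = coarse_matrix(W, g)
assert (hierarchical_retrieve(W, Wc, g, pats[0]) == pats[0]).all()
```

Blocks whose coarse potential misses the threshold are never touched at the fine level, which is the early selective filtering the method exploits.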
Experimental results
Through a series of numerical simulations on a sequential computer, we have measured the effective retrieval costs associated with varying hierarchy dispositions. The experiments were conducted on a mid-sized square associative memory with neurons. Without loss of generality, the memory performed an auto-associative task—similar results should be observed at equivalent capacity loads (i.e., with a higher number of stored associations) for hetero-association.
As Fig. 2 highlights,
Conclusions
Nearest neighbour determination for binary patterns is a well-known problem for which numerous approaches exist—take for instance locality-sensitive hashing (Andoni, Datar, Immorlica, Indyk, & Mirrokni, 2006) as a recent method developed within the computer science community. Neural associative memories are biologically inspired solutions for this problem.
We have chosen to apply our hierarchical representation scheme to the classic ‘Steinbuch–Willshaw–Palm’ model due to its amenability to analysis,
Acknowledgements
The authors wish to express their gratitude towards Jan Cederquist and the anonymous reviewers for their helpful and incisive comments on early versions of this manuscript. The authors are also indebted to Alexander Pattenden for proofreading and suggesting useful corrections to the language. This work was supported by Fundação para a Ciência e Tecnologia (INESC-ID multiannual funding) through the PIDDAC Program funds and through an individual doctoral grant awarded to the first author.
References (26)
Characteristics of sparsely encoded associative memory. Neural Networks (1989).
Information storage and effective data retrieval in sparse matrices. Neural Networks (1989).
A cascade associative memory model with a hierarchical memory structure. Neural Networks (2000).
Hierarchical concept formation in associative memory composed of neuro-window elements. Neural Networks (1996).
Pattern separation and synchronization in spiking associative memories and visual areas. Neural Networks (2001).
Energy as a constraint on the coding and processing of sensory information. Current Opinion in Neurobiology (2001).
The cost of cortical computation. Current Biology (2003).
Storing and restoring visual input with collaborative rank coding and associative memory. Neurocomputing (2006).
Improved bidirectional retrieval of sparse patterns stored by Hebbian learning. Neural Networks (1999).
Cell assemblies for diagnostic problem-solving. Neurocomputing (2006).
Locality-sensitive hashing scheme based on p-stable distributions.
Capacity and information efficiency of a brain-like associative net.
Improving recall from an associative memory. Biological Cybernetics.
Cited by (7)
Regarding the temporal requirements of a hierarchical Willshaw network. Neural Networks (2012). Citation excerpt: “The rest of this paper is organised as follows. In Section 2, we review the network model presented in Sacramento and Wichert (2011) and derive exact expectations for the time requirements of learning and retrieval. Then, in Section 3, we analyse the optimisation task of determining the hierarchical configuration which minimises the time expectation we obtain.”
Quantum Artificial Intelligence with Qiskit (2024).
Quantum Lernmatrix. Entropy (2023).
A simplified computational memory model from information processing. Scientific Reports (2016).
Taxonomical Associative Memory. Cognitive Computation (2014).
Feature selection using associative memory paradigm and parallel computing. Computación y Sistemas (2013).