
Cross-modal associative memory by MultiSOM



Abstract:

This paper proposes a novel associative memory model based on the Self-Organizing Map (SOM), called MultiSOM. The model learns associative relationships between data from different sources, typically in different modalities. However, the data and the relationships between them are not entered into the network and trained directly. Instead, each data source is trained together with the same semantic data, so that all sources ultimately share one topological map. As a cross-modal demonstration, this paper trains the MultiSOM model to learn an associative memory between images and human voice recordings of Chinese characters, using their meanings as the semantic data, and the experimental results suggest that the MultiSOM model can learn the bidirectional associative relationship.
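The abstract's core idea, multiple modalities sharing one topological map so that a stimulus in one modality can recall its counterpart in the other, can be illustrated with a minimal sketch. The following is not the authors' implementation: it is a toy SOM whose nodes each hold weight vectors for two hypothetical modalities ("image" and "voice" features are synthetic), with the best-matching unit chosen on the joint distance so both modalities shape the same map. Recall then looks up the BMU using one modality's weights and reads out the other's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3 semantic classes; each class has a hypothetical 4-D
# "image" prototype and a 4-D "voice" prototype.
n_classes, dim = 3, 4
img_protos = rng.normal(size=(n_classes, dim))
voice_protos = rng.normal(size=(n_classes, dim))

def make_samples(n):
    labels = rng.integers(0, n_classes, size=n)
    imgs = img_protos[labels] + 0.05 * rng.normal(size=(n, dim))
    voices = voice_protos[labels] + 0.05 * rng.normal(size=(n, dim))
    return imgs, voices, labels

imgs, voices, _ = make_samples(200)

# One shared 5x5 map: every node carries weights for BOTH modalities,
# so the two modalities share a single topology.
map_size = 5
W_img = rng.normal(size=(map_size * map_size, dim))
W_voice = rng.normal(size=(map_size * map_size, dim))
coords = np.array([(i, j) for i in range(map_size)
                   for j in range(map_size)], dtype=float)

epochs = 30
for e in range(epochs):
    lr = 0.5 * (1 - e / epochs)            # annealed learning rate
    sigma = 2.0 * (1 - e / epochs) + 0.5   # annealed neighborhood width
    for x_img, x_voice in zip(imgs, voices):
        # BMU chosen on the joint (both-modality) distance
        d = (np.linalg.norm(W_img - x_img, axis=1)
             + np.linalg.norm(W_voice - x_voice, axis=1))
        bmu = int(np.argmin(d))
        # Gaussian neighborhood on the shared map grid
        h = np.exp(-np.sum((coords - coords[bmu]) ** 2, axis=1)
                   / (2 * sigma ** 2))[:, None]
        W_img += lr * h * (x_img - W_img)
        W_voice += lr * h * (x_voice - W_voice)

def recall_voice(x_img):
    """Image -> voice: find the BMU with image weights only,
    then read out that node's voice weights."""
    bmu = int(np.argmin(np.linalg.norm(W_img - x_img, axis=1)))
    return W_voice[bmu]

# Evaluate the image -> voice direction: the recalled voice vector
# should be closest to the voice prototype of the same class.
test_imgs, _, test_labels = make_samples(30)
hits = sum(
    int(np.argmin(np.linalg.norm(voice_protos - recall_voice(x), axis=1)) == y)
    for x, y in zip(test_imgs, test_labels)
)
accuracy = hits / len(test_labels)
print(f"image->voice recall accuracy: {accuracy:.2f}")
```

A symmetric `recall_image` (BMU on `W_voice`, readout from `W_img`) gives the other direction of the bidirectional association. The paper's model additionally routes learning through shared semantic data rather than joint training on paired samples; this sketch only shows the shared-map recall mechanism.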
Date of Conference: 11-14 May 2014
Date Added to IEEE Xplore: 23 October 2014
Conference Location: Aalborg, Denmark

