Modular Composite Representation

Published in: Cognitive Computation

Abstract

High-dimensional vector spaces have noteworthy properties that make them attractive for representation models. A reduced description model is a mechanism for encoding complex structures as single high-dimensional vectors. These vectors can be used directly to perform complex operations such as analogies, inferences, and structural comparisons, and the whole structure can be reconstructed from the reduced description vector. Here, we introduce the modular composite representation (MCR), a new reduced description model that employs long integer vectors. We also describe several experiments with it, and give a theoretical analysis of the distance distribution in this vector space and of the properties of this representation. Finally, we compare MCR with two other reduced description models: Spatter Code and holographic reduced representation.
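The abstract's description of MCR can be illustrated with a minimal sketch: integer vectors with entries modulo r, bound together by elementwise modular sum and unbound by modular subtraction. The dimensionality n, modulus r, and function names below are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n, r = 1000, 16  # assumed dimensionality and modulus, for illustration only

def random_vec():
    """Random integer vector with entries in Z_r."""
    return rng.integers(0, r, size=n)

def bind(a, b):
    """Bind two vectors by elementwise modular sum."""
    return (a + b) % r

def unbind(c, a):
    """Recover b from c = bind(a, b) by elementwise modular subtraction."""
    return (c - a) % r

role, filler = random_vec(), random_vec()
bound = bind(role, filler)
recovered = unbind(bound, role)
assert np.array_equal(recovered, filler)
```

Because binding is invertible, a single bound vector (a reduced description) can yield back its components exactly when one of them is known; in practice, noisy composites are cleaned up against a memory of known vectors.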

[Figures 1–6 appear in the full-text article.]

Notes

  1. In the context of this work, we define elements as things that can be represented, for example, objects, actions, features, events, etc.

  2. Some systems can create reduced descriptions without explicitly defining these operations. For example, see RAAM [25].

  3. Actually, XOR is a special case of the modular sum when r = 2.

  4. This vector could have been omitted, but we chose to follow Plate’s example that included it.
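Note 3 above observes that XOR is the modular sum with r = 2. This is easy to verify numerically: for binary vectors, elementwise addition modulo 2 and bitwise XOR produce identical results (the vectors here are arbitrary examples).

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.integers(0, 2, size=8)
b = rng.integers(0, 2, size=8)

# For r = 2, the elementwise modular sum coincides with bitwise XOR.
mod_sum = (a + b) % 2
xor = a ^ b
assert np.array_equal(mod_sum, xor)
```

This is why Spatter Code's XOR binding can be seen as the r = 2 special case of MCR's modular-sum binding.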

References

  1. Hinton GE, McClelland JL, Rumelhart DE. Distributed representations. In: Rumelhart DE, McClelland JL, editors. Parallel distributed processing: explorations in the microstructure of cognition volume 1: foundations. Cambridge, MA: MIT Press; 1986. p. 77–109.

  2. Deerwester SC, Dumais ST, Furnas GW, Landauer TK, Harshman RA. Indexing by latent semantic analysis. J Am Soc Inf Sci. 1990;41(6):391–407.

  3. Franklin S, Patterson FGJ. The LIDA architecture: adding new modes of learning to an intelligent, autonomous, software agent. In IDPT-2006 Proceedings (integrated design and process technology): society for design and process science; 2006.

  4. Laird JE. Extending the Soar Cognitive Architecture. In: Wang P, Goertzel B, Franklin S, editors. Artificial general intelligence 2008. Amsterdam: IOS Press; 2008.

  5. Hinton GE. Mapping part-whole hierarchies into connectionist networks. Artif Intell. 1990;46:47–75.

  6. Plate TA. Holographic reduced representations. IEEE Trans Neural Networks. 1995;6(3):623–41.

  7. Plate TA. Holographic reduced representation: distributed representation of cognitive structure. Stanford: CSLI; 2003.

  8. Kanerva P. Hyperdimensional computing: an introduction to computing in distributed representation with high-dimensional random vectors. Cognit Comput. 2009;1(2):139–59.

  9. Stewart TC, Eliasmith C, editors. Neural planning and reasoning using the synaptic connections of the basal ganglia and thalamus. Biologically inspired cognitive architectures 2011. Washington, DC: IOS Press; 2011.

  10. Franklin S. Artificial minds. Cambridge, MA: MIT Press; 1995.

  11. Kanerva P. Sparse distributed memory. Cambridge, MA: The MIT Press; 1988.

  12. Winston PH. Artificial intelligence. 3rd ed. Boston, MA: Addison Wesley; 1992.

  13. Foundalis HE. PHAEACO: a cognitive architecture inspired by Bongard’s problems. PhD Thesis, Indiana University, Indiana; 2006.

  14. Jockel S. Crossmodal learning and prediction of autobiographical episodic experiences using a sparse distributed memory. PhD Thesis, University of Hamburg, Hamburg; 2009.

  15. Robertson P, Laddaga R. Learning to find structure in a complex world. In: Samsonovich AV, Jóhannsdóttir KR, editors. Biological inspired cognitive architectures 2011. Washington, DC: IOS Press; 2011.

  16. Sahlgren M, editor. An introduction to random indexing. Methods and applications of semantic indexing workshop at the 7th international conference on terminology and knowledge engineering, TKE 2005. Copenhagen, Denmark; 2005.

  17. Cohen T, Widdows D. Empirical distributional semantics: methods and biomedical applications. J Biomed Inform. 2009;42(2):390–405.

  18. Turney PD, Pantel P. From frequency to meaning: vector space models of semantics. J Artif Intell Res. 2010;37:141–88.

  19. Cambria E, Hussain A. Sentic computing. Dordrecht: Springer; 2012.

  20. Grassi M, Cambria E, Hussain A, Piazza F. Sentic web: a new paradigm for managing social media affective information. Cognit Comput. 2011;3(3):480–9.

  21. Wang QF, Cambria E, Liu CL, Hussain A. Common sense knowledge for handwritten Chinese recognition. Cognit Comput. 2013;5(2):234–42.

  22. Cambria E, Hussain A. Sentic album: content-, concept-, and context-based online personal photo management system. Cognit Comput. 2012;4(4):477–96.

  23. Jones MN, Mewhort DJK. Representing word meaning and order information in a composite holographic lexicon. Psychol Rev. 2007;114:1–37.

  24. Kanerva P. The binary spatter code for encoding concepts at many levels. In: Marinaro M, Morasso P, editors. ICANN’94: proceedings of international conference on artificial neural networks. London: Springer; 1994. p. 226–9.

  25. Pollack JB. Recursive distributed representations. Artif Intell. 1990;46(1–2):77–105.

  26. Willshaw DJ, Buneman OP, Longuet-Higgins HC. Non-holographic associative memory. Nature. 1969;222(5197):960–2.

  27. Rachkovskij DA, Kussul EM. Binding and normalization of binary sparse distributed representations by context-dependent thinning. Neural Comput. 2001;13(2):411–52.

  28. Snaider J, Franklin S. Extended sparse distributed memory and sequence storage. Cognit Comput. 2012;4(2):172–80.

  29. Snaider J, Franklin S, Strain S, George EO. Integer sparse distributed memory: analysis and results. Neural Netw. 2013;46:144–53.

  30. Snaider J, Franklin S, editors. Integer sparse distributed memory. The 25th Florida artificial intelligence research society conference FLAIRS-25. Marco Island, FL; 2012.

  31. Snaider J. Integer sparse distributed memory and modular composite representation. PhD Thesis. University of Memphis, Memphis, TN; 2012.

  32. Serre T, Wolf L, Bileschi S, Riesenhuber M, Poggio T. Robust object recognition with cortex-like mechanisms. IEEE Trans Pattern Anal Mach Intell. 2007;29(3):411–26.

  33. George D. How the brain might work: a hierarchical and temporal model for learning and recognition. PhD Thesis. Stanford University; 2008.

  34. Arel I, Rose D, Coop R. DeSTIN: A scalable deep learning architecture with application to high-dimensional robust pattern recognition. In: Proceedings of the AAAI 2009 fall symposium on biologically inspired cognitive architectures (BICA); 2009.

  35. Hinton GE. Learning multiple layers of representation. TRENDS Cogn Sci. 2007;11(10):428–34.

  36. Hinton GE, Osindero S, Teh Y. A fast learning algorithm for deep belief nets. Neural Comput. 2006;18:1527–54.

  37. Snaider J, McCall R, Franklin S. The LIDA framework as a general tool for AGI. The fourth conference on artificial general intelligence. Mountain View, CA; 2011.

  38. Starzyk JA, He H. Anticipation-based temporal sequences learning in hierarchical structure. IEEE Trans Neural Netw. 2007;18(2):344–58.

  39. Sun R, Giles CL. Sequence learning: from recognition and prediction to sequential decision making. IEEE Intell Syst. 2001;16(4):67–70.

Acknowledgments

We thank Steve Strain for his suggestions and comments and for his help in editing this manuscript. We are also indebted to several anonymous reviewers whose comments helped us greatly improve the paper.

Author information

Corresponding author

Correspondence to Javier Snaider.

About this article

Cite this article

Snaider, J., Franklin, S. Modular Composite Representation. Cogn Comput 6, 510–527 (2014). https://doi.org/10.1007/s12559-013-9243-y
