
Capacity and Retrieval of a Modular Set of Diluted Attractor Networks with Respect to the Global Number of Neurons

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10305)

Abstract

Modularity and hierarchical structure in associative networks can reproduce the parallel pattern retrieval and multitasking abilities found in complex neural systems. These properties can be exhibited by an ensemble of diluted Attractor Neural Networks performing pattern retrieval. Previous work has shown that such a modular structure increases the single-attractor storage capacity through a divide-and-conquer arrangement of diluted subnetwork modules, in which each module learns a disjoint subset of unbiased binary patterns. The present article studies an ensemble of diluted Attractor Neural Networks for different values of the global number of network units and compares its performance with that of a single fully connected network at the same cost (total number of connections). At equal wiring cost, the ensemble more than doubles the maximal capacity of the single network. The presented approach can be useful for engineering applications in memory-limited systems such as embedded devices or smartphones.
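As a concrete illustration of the scheme described in the abstract, the sketch below builds an ensemble of diluted Hopfield-type modules, each storing its own disjoint subset of unbiased ±1 patterns with a Hebbian rule on randomly diluted connectivity, and chooses the dilution so that the total number of connections matches a single fully connected reference network. This is only a minimal sketch under those assumptions, not the authors' implementation; the names, sizes, dilution rule, and synchronous update schedule (hebbian_diluted_weights, retrieve, N_fc, M, n, P_per_module, keep_prob) are hypothetical choices made for illustration.

```python
# Minimal sketch (not the authors' code): an ensemble of diluted Hopfield-type
# modules, each storing a disjoint subset of unbiased +/-1 patterns, with the
# dilution chosen so that the total wiring cost equals that of a single fully
# connected reference network. All sizes below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def hebbian_diluted_weights(patterns, keep_prob):
    """Hebbian weights of one module, kept only on a random symmetric graph."""
    P, n = patterns.shape
    W = patterns.T @ patterns / n          # standard Hebb rule
    np.fill_diagonal(W, 0.0)
    mask = rng.random((n, n)) < keep_prob  # Erdos-Renyi style dilution
    mask = np.triu(mask, 1)
    mask = mask | mask.T                   # keep the connectivity symmetric
    return W * mask

def retrieve(W, cue, max_steps=50):
    """Synchronous sign updates until a fixed point or the step limit."""
    s = cue.astype(float).copy()
    for _ in range(max_steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Reference: a single fully connected network with N_fc units.
N_fc = 400
cost_fc = N_fc * (N_fc - 1)                # number of directed connections

# Ensemble: M modules of n units each (global size M * n), diluted so that the
# total number of connections equals the fully connected reference's cost.
M, n, P_per_module = 4, 400, 10
keep_prob = cost_fc / (M * n * (n - 1))    # = 0.25 for these sizes

overlaps = []
for _ in range(M):
    xi = rng.choice([-1, 1], size=(P_per_module, n))   # disjoint pattern subset
    W = hebbian_diluted_weights(xi, keep_prob)
    noise = rng.choice([1, -1], size=n, p=[0.9, 0.1])  # flip ~10% of the cue bits
    out = retrieve(W, xi[0] * noise)
    overlaps.append(out @ xi[0] / n)                   # retrieval overlap in [-1, 1]

print("keep probability per module:", round(keep_prob, 3))
print("per-module retrieval overlaps:", np.round(overlaps, 3))
```

In this sketch the keep probability is simply the wiring budget of the fully connected reference divided by the number of possible directed links across all modules, so the global number of neurons can grow while the connection cost stays fixed, which is the comparison the paper makes.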

Acknowledgments

This work was funded by the Spanish Ministerio de Economía y Competitividad projects TIN-2010-19607, TIN2014-54580-R, and TIN2014-57458-R (http://www.mineco.gob.es/), and by the projects UNEMI-2016-CONV-P-01-01 and DITC-UDLA, Ecuador. The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author information

Corresponding authors

Correspondence to Mario González or Francisco B. Rodríguez.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

González, M., Dominguez, D., Sánchez, Á., Rodríguez, F.B. (2017). Capacity and Retrieval of a Modular Set of Diluted Attractor Networks with Respect to the Global Number of Neurons. In: Rojas, I., Joya, G., Catala, A. (eds) Advances in Computational Intelligence. IWANN 2017. Lecture Notes in Computer Science, vol 10305. Springer, Cham. https://doi.org/10.1007/978-3-319-59153-7_43

  • DOI: https://doi.org/10.1007/978-3-319-59153-7_43

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-59152-0

  • Online ISBN: 978-3-319-59153-7

  • eBook Packages: Computer Science (R0)
