
Continuous Attractors of Nonlinear Neural Networks with Asymmetric Connection Weights

  • Conference paper
  • First Online:
Neural Information Processing (ICONIP 2018)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11302)


Abstract

Continuous attractor neural networks with symmetric connection weights have been studied widely. However, few results are available for continuous attractor neural networks with asymmetric connection weights. This paper studies networks with asymmetric connection weights. To overcome the difficulties caused by the asymmetry, a new norm is introduced on a general vector space that is not a Euclidean space; a corresponding distance and notion of attractivity are then defined. Finally, explicit expressions for the continuous attractors of Cellular Neural Networks with asymmetric connection weights are obtained.
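The object the abstract describes, a continuum of stable equilibria in a recurrent network whose weight matrix is not symmetric, can be illustrated with a toy example. The following sketch is ours, not the paper's construction: a two-neuron linear network dx/dt = -x + Wx whose asymmetric W has eigenvalue 1, so the equilibria (the null space of W - I) form a line, and different initial states relax onto different points of that line.

```python
# Minimal sketch (not the paper's construction): a line attractor in a
# two-neuron linear recurrent network with an ASYMMETRIC weight matrix W.
# Equilibria of dx/dt = -x + W x form the null space of (W - I); for this W
# that null space is the x1-axis, a continuum of stable fixed points.

W = [[1.0, 0.5],
     [0.0, 0.5]]   # asymmetric: W[0][1] != W[1][0]

def simulate(x, dt=0.01, steps=5000):
    """Euler-integrate dx/dt = -x + W x from initial state x."""
    x1, x2 = x
    for _ in range(steps):
        dx1 = -x1 + W[0][0] * x1 + W[0][1] * x2
        dx2 = -x2 + W[1][0] * x1 + W[1][1] * x2
        x1, x2 = x1 + dt * dx1, x2 + dt * dx2
    return x1, x2

# Different initial states settle onto different points of the same line
# {(a, 0)} -- the signature of a continuous (line) attractor.
print(simulate([1.0, 1.0]))  # settles near (2.0, 0.0)
print(simulate([0.0, 1.0]))  # settles near (1.0, 0.0)
```

Because W is asymmetric, symmetric-network tools (energy functions, spectral decompositions with orthogonal eigenvectors) do not apply directly, which is the difficulty the paper's new norm is designed to handle.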

This work was supported by the National Natural Science Foundation of China under Grants 61572112, 61103041, 61432012, and 61803228, and by the Fundamental Research Funds for the Central Universities under Grant ZYGX2016J136.



Author information

Correspondence to Jiali Yu.


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Yu, J., Yi, Z., Wang, C., Liao, Y., Pang, Z. (2018). Continuous Attractors of Nonlinear Neural Networks with Asymmetric Connection Weights. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol 11302. Springer, Cham. https://doi.org/10.1007/978-3-030-04179-3_35

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-04179-3_35

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04178-6

  • Online ISBN: 978-3-030-04179-3

  • eBook Packages: Computer Science (R0)
