Abstract
Continuous attractor neural networks with symmetric connection weights have been studied widely, but few results are available for networks with asymmetric connection weights. This paper studies continuous attractor neural networks with asymmetric connection weights. To overcome the difficulties caused by the asymmetry, a new norm is proposed on a general, non-Euclidean vector space, and the corresponding distance and notion of attractivity are then defined. Finally, explicit expressions for the continuous attractors of cellular neural networks with asymmetric connection weights are obtained.
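The setting above concerns networks whose equilibria form a connected continuum rather than isolated fixed points. As a minimal sketch of that idea (a toy linear model of my own choosing, not the paper's cellular-neural-network construction or its proposed norm), the Python snippet below builds an asymmetric weight matrix W with eigenvalue 1, so that the corresponding eigendirection is a line of equilibria of dx/dt = -x + Wx, and checks numerically that nearby trajectories settle onto that line.

```python
import numpy as np

# Toy illustration (assumed model, not the paper's): linear recurrent
# dynamics dx/dt = -x + W x with an ASYMMETRIC W that has eigenvalue 1.
# The eigenvector for eigenvalue 1 spans a line of equilibria; the other
# eigenvalue (0.3 < 1) makes that line attracting.
S = np.array([[1.0, 1.0],
              [2.0, -1.0]])            # non-orthogonal eigenvectors -> W is asymmetric
D = np.diag([1.0, 0.3])
W = S @ D @ np.linalg.inv(S)
assert not np.allclose(W, W.T)         # connection weights are indeed asymmetric

def simulate(x0, dt=0.01, steps=5000):
    """Euler integration of dx/dt = -x + W x."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + W @ x)
    return x

x_final = simulate([3.0, -1.0])
line_dir = S[:, 0] / np.linalg.norm(S[:, 0])          # direction of the line attractor
residual = x_final - line_dir * (line_dir @ x_final)  # orthogonal distance to the line
print(x_final, np.linalg.norm(residual))              # residual is numerically ~ 0
```

Proving attractivity for genuinely asymmetric weights is where a non-Euclidean (e.g. weighted) norm becomes convenient, since the usual Euclidean energy arguments lean on symmetry of W; the paper's contribution is a norm and distance tailored to this asymmetric case.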
This work is supported by the National Natural Science Foundation of China under Grants 61572112, 61103041, 61432012, and 61803228, and by the Fundamental Research Funds for the Central Universities under Grant ZYGX2016J136.
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Yu, J., Yi, Z., Wang, C., Liao, Y., Pang, Z. (2018). Continuous Attractors of Nonlinear Neural Networks with Asymmetric Connection Weights. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol. 11302. Springer, Cham. https://doi.org/10.1007/978-3-030-04179-3_35
DOI: https://doi.org/10.1007/978-3-030-04179-3_35
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-04178-6
Online ISBN: 978-3-030-04179-3