Abstract
Inspired by the differences in neurons’ membrane timescales, the multiple-timescale recurrent neural network (MTRNN) adopts a hierarchical architecture in which timescales increase from the bottom layers to the top. Building on this idea, the recent adaptive continuous-time recurrent neural network (ACTRNN) and the gated adaptive continuous-time recurrent neural network (GACTRNN) introduce novel mechanisms for learning the timescales themselves. In this paper, we evaluate the GACTRNN on a dataset obtained from an object-manipulation experiment with a real-world humanoid robot. With trainable timescale parameters and the gating mechanism, the GACTRNN learns the temporal characteristics of the sequences more effectively. Furthermore, to prevent the timescale parameters from growing unboundedly on a large dataset, we improve the GACTRNN and propose the MATRNN model, which replaces the exponential function with a sigmoid function. We compare the CTRNN, GACTRNN, and MATRNN models and find that the GACTRNN and MATRNN outperform the CTRNN on the large-scale dataset. By visualizing how the timescales adapt during training, we also show qualitatively that the MATRNN is more stable than the GACTRNN on this dataset.
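The core contrast described above — an exponential versus a sigmoid parameterization of per-unit timescales in a leaky-integrator (CTRNN-style) update — can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' exact model: the weight names, shapes, and the `tau_max` bound are assumptions introduced for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class AdaptiveTimescaleRNNCell:
    """Sketch of a CTRNN cell with input-dependent timescales.

    The "exp" mode loosely mirrors the GACTRNN idea (multiplicative,
    unbounded timescales); the "sigmoid" mode mirrors the bounded
    parameterization the paper proposes for MATRNN. All names and
    shapes here are illustrative assumptions.
    """

    def __init__(self, n_in, n_hid, tau_max=50.0, mode="sigmoid", seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0, 0.1, (n_hid, n_in))   # input weights
        self.U = rng.normal(0, 0.1, (n_hid, n_hid))  # recurrent weights
        self.b = np.zeros(n_hid)
        # hypothetical parameters of the timescale gate
        self.Wt = rng.normal(0, 0.1, (n_hid, n_in))
        self.Ut = rng.normal(0, 0.1, (n_hid, n_hid))
        self.bt = np.zeros(n_hid)
        self.tau_max = tau_max
        self.mode = mode

    def step(self, x, h):
        pre = self.Wt @ x + self.Ut @ h + self.bt
        if self.mode == "exp":
            # exponential timescale: unbounded, can overgrow on large data
            tau = 1.0 + np.exp(pre)
        else:
            # sigmoid timescale: bounded in (1, tau_max), more stable
            tau = 1.0 + (self.tau_max - 1.0) * sigmoid(pre)
        z = np.tanh(self.W @ x + self.U @ h + self.b)
        # leaky-integrator (CTRNN) update with per-unit timescale tau
        return (1.0 - 1.0 / tau) * h + (1.0 / tau) * z
```

Because the sigmoid keeps every timescale inside a fixed interval, the gate's pre-activation can drift during training on a large dataset without the effective timescale exploding, which is the stability property the abstract attributes to MATRNN.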
Acknowledgments
This work is partially supported by PolyU Start-up Grant (ZVUY-P0035417).
© 2021 Springer Nature Switzerland AG
Cite this paper
Zhao, L., Zhong, J. (2021). Recurrent Neural Network with Adaptive Gating Timescales Mechanisms for Language and Action Learning. In: Mantoro, T., Lee, M., Ayu, M.A., Wong, K.W., Hidayanto, A.N. (eds) Neural Information Processing. ICONIP 2021. Communications in Computer and Information Science, vol 1517. Springer, Cham. https://doi.org/10.1007/978-3-030-92310-5_47
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-92309-9
Online ISBN: 978-3-030-92310-5
eBook Packages: Computer Science (R0)