
Recurrent Neural Network with Adaptive Gating Timescales Mechanisms for Language and Action Learning

  • Conference paper
Neural Information Processing (ICONIP 2021)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1517)


Abstract

Inspired by differences in neurons’ membrane timescales, the multiple-timescale recurrent neural network (MTRNN) adopts a hierarchical architecture in which timescales increase from the bottom to the top layers. Building on this idea, the recent adaptive continuous-time recurrent neural network (ACTRNN) and gated adaptive continuous-time recurrent neural network (GACTRNN) introduce learning mechanisms for the timescales themselves. In this paper, we test the performance of the GACTRNN on a dataset obtained from an object-manipulation experiment with a real-world humanoid robot. With trainable timescale parameters and a gating mechanism, the GACTRNN learns the temporal characteristics of the sequences better. Moreover, to prevent the timescale parameters from growing excessively on a large dataset, we improve the GACTRNN and propose the MATRNN model, which uses a sigmoid function in place of the exponential function. Comparing the CTRNN, GACTRNN, and MATRNN models, we find that the GACTRNN and MATRNN outperform the CTRNN on the large-scale dataset. By visualizing how the timescales adapt during training, we also show qualitatively that the MATRNN is more stable than the GACTRNN on this dataset.



Acknowledgments

This work is partially supported by PolyU Start-up Grant (ZVUY-P0035417).

Author information

Correspondence to Libo Zhao.

Editor information

Editors and Affiliations

Rights and permissions

Reprints and permissions

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhao, L., Zhong, J. (2021). Recurrent Neural Network with Adaptive Gating Timescales Mechanisms for Language and Action Learning. In: Mantoro, T., Lee, M., Ayu, M.A., Wong, K.W., Hidayanto, A.N. (eds) Neural Information Processing. ICONIP 2021. Communications in Computer and Information Science, vol 1517. Springer, Cham. https://doi.org/10.1007/978-3-030-92310-5_47


  • DOI: https://doi.org/10.1007/978-3-030-92310-5_47

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-92309-9

  • Online ISBN: 978-3-030-92310-5

  • eBook Packages: Computer Science (R0)
