
Heterogeneous recurrent neural networks for natural language model

  • Original Article
  • Published in Artificial Life and Robotics

Abstract

A neural network for language modeling is proposed and its performance is explored. The proposed network consists of two recurrent networks whose structures differ from each other. Both networks accept words as input, translate them into distributed representations, and produce the probability of each word occurring next in the sequence of input words. The performance of the proposed network is investigated by constructing language models and comparing it against a single recurrent neural network and a long short-term memory network.
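The abstract does not specify the architecture in detail; the following is a minimal NumPy sketch of the idea as described — two recurrent networks with different structures (here simply different hidden sizes), sharing one word-embedding table and jointly producing next-word probabilities. All names, sizes, and the concatenation scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, EMBED = 10, 8   # toy vocabulary and embedding sizes (assumed)
H1, H2 = 16, 4         # two different hidden sizes, standing in for the
                       # "heterogeneous" structures of the two networks

# Shared word embedding: the distributed representation of each word.
E = rng.normal(scale=0.1, size=(VOCAB, EMBED))

class SimpleRNN:
    """Elman-style recurrent cell: h_t = tanh(W x_t + U h_{t-1})."""
    def __init__(self, hidden):
        self.W = rng.normal(scale=0.1, size=(hidden, EMBED))
        self.U = rng.normal(scale=0.1, size=(hidden, hidden))
        self.hidden = hidden

    def run(self, xs):
        h = np.zeros(self.hidden)
        for x in xs:                      # consume the word sequence
            h = np.tanh(self.W @ x + self.U @ h)
        return h                          # final hidden state

net_a, net_b = SimpleRNN(H1), SimpleRNN(H2)
V = rng.normal(scale=0.1, size=(VOCAB, H1 + H2))  # output projection

def next_word_probs(word_ids):
    """Softmax distribution over the next word, given a word-id sequence."""
    xs = [E[i] for i in word_ids]
    # Combine the two heterogeneous networks' states (assumed: concatenation).
    h = np.concatenate([net_a.run(xs), net_b.run(xs)])
    z = V @ h
    p = np.exp(z - z.max())               # numerically stable softmax
    return p / p.sum()

p = next_word_probs([1, 4, 7])            # probabilities over the vocabulary
```

With trained weights, `p` would give the model's estimate of which of the `VOCAB` words follows the input sequence; here the weights are random, so only the shapes and normalization are meaningful.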



Author information

Correspondence to Teijiro Isokawa.

Additional information

This study was financially supported by the Japan Society for the Promotion of Science (Grants-in-Aid for Scientific Research (C) 16K00248 and 16K00337).

This work was presented in part at the 23rd International Symposium on Artificial Life and Robotics, Beppu, Oita, January 18–20, 2018.

About this article


Cite this article

Tsuji, M., Isokawa, T., Yumoto, T. et al. Heterogeneous recurrent neural networks for natural language model. Artif Life Robotics 24, 245–249 (2019). https://doi.org/10.1007/s10015-018-0507-1
