Abstract
Neural networks for language modeling are proposed and their performance is explored. The proposed network consists of two recurrent networks whose structures differ from each other. Both networks accept words as input, translate them into distributed representations, and produce the probabilities of words occurring next in the input sequence. The performance of the proposed network is investigated by constructing language models and comparing them with a single recurrent neural network and a long short-term memory network.
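The abstract does not specify the internal structures of the two recurrent branches or how they are combined. A minimal sketch of the general idea, assuming (purely for illustration) one vanilla RNN branch and one LSTM branch whose hidden states are concatenated before a softmax over the vocabulary, written in PyTorch; all layer sizes and the combination scheme are assumptions, not the authors' design:

```python
import torch
import torch.nn as nn

class HeterogeneousRNNLM(nn.Module):
    """Sketch of a language model built from two structurally different
    recurrent branches. Using nn.RNN and nn.LSTM as the two branches is
    an assumption; the abstract only states that their structures differ."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        # Shared distributed representation (word embedding) layer.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Two recurrent branches with different structures.
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Combine both branches to score the next word.
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word indices.
        x = self.embed(tokens)
        h_rnn, _ = self.rnn(x)
        h_lstm, _ = self.lstm(x)
        h = torch.cat([h_rnn, h_lstm], dim=-1)
        # Logits over the vocabulary at each position; a softmax turns
        # them into next-word probabilities.
        return self.out(h)

# Example: next-word probabilities for a toy batch.
model = HeterogeneousRNNLM(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 5))      # (batch=2, seq_len=5)
probs = torch.softmax(model(tokens), dim=-1)  # (2, 5, 10000)
```

Trained with a cross-entropy loss on next-word prediction, such a model plays the same role as the single-RNN and LSTM baselines mentioned above, differing only in that two heterogeneous recurrent branches feed the output layer.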









Additional information
This study was financially supported by the Japan Society for the Promotion of Science (Grants-in-Aid for Scientific Research (C) 16K00248 and 16K00337).
This work was presented in part at the 23rd International Symposium on Artificial Life and Robotics, Beppu, Oita, January 18–20, 2018.
Cite this article
Tsuji, M., Isokawa, T., Yumoto, T. et al. Heterogeneous recurrent neural networks for natural language model. Artif Life Robotics 24, 245–249 (2019). https://doi.org/10.1007/s10015-018-0507-1