
Performance Analysis of FFNN-Based Language Model in Contrast with n-Gram


Abstract

In this paper, we analyze the performance of a feed-forward neural network (FFNN)-based language model in contrast with an n-gram model. The probabilities of the n-gram language model were estimated from the statistics of word sequences. The FFNN-based language model consists of three hidden layers with 500 hidden units per layer and a 30-dimensional word embedding. The FFNN-based language model outperforms the n-gram model by 1.5 % in word error rate (WER) on the English WSJ domain.
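For contrast with the count-based baseline, the following is a minimal sketch of maximum-likelihood bigram estimation, i.e., estimating probabilities from word-sequence statistics as the abstract describes. It is illustrative only: a deployed n-gram model would use higher orders, smoothing, and backoff, none of which are specified here.

    from collections import Counter

    def bigram_probs(tokens):
        # MLE bigram estimates P(w2 | w1) from raw counts (no smoothing).
        unigrams = Counter(tokens[:-1])             # counts of history words
        bigrams = Counter(zip(tokens, tokens[1:]))  # counts of adjacent pairs
        return {(w1, w2): c / unigrams[w1] for (w1, w2), c in bigrams.items()}

    probs = bigram_probs("the cat sat on the mat".split())
    print(probs[("the", "cat")])  # 0.5: "the" is followed by "cat" once out of two

The FFNN architecture described in the abstract (three hidden layers of 500 units each, 30-dimensional word embeddings) can be sketched as below in PyTorch. The vocabulary size, the context length of three preceding words, and the tanh activation are assumptions for illustration; the abstract does not state them.

    import torch
    import torch.nn as nn

    class FFNNLanguageModel(nn.Module):
        # Feed-forward LM: an (n-1)-word context predicts the next word.
        # Hidden depth/width and embedding size follow the abstract; the
        # remaining hyperparameters are assumed.
        def __init__(self, vocab_size=10000, context_len=3,
                     embed_dim=30, hidden_dim=500):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.net = nn.Sequential(
                nn.Linear(context_len * embed_dim, hidden_dim), nn.Tanh(),
                nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
                nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
                nn.Linear(hidden_dim, vocab_size),
            )

        def forward(self, context):     # context: (batch, context_len) word ids
            e = self.embed(context)     # (batch, context_len, embed_dim)
            e = e.flatten(start_dim=1)  # concatenate the context embeddings
            return self.net(e)          # unnormalized logits over the vocabulary

    model = FFNNLanguageModel()
    logits = model(torch.randint(0, 10000, (8, 3)))  # batch of 8 contexts
    log_probs = torch.log_softmax(logits, dim=-1)    # next-word log-probabilities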



Acknowledgments

This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (No. NRF-2014R1A1A1002197).

Author information

Correspondence to J.-H. Kim.



Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Kim, K.-H. et al. (2015). Performance Analysis of FFNN-Based Language Model in Contrast with n-Gram. In: Lee, G., Kim, H., Jeong, M., Kim, J.-H. (eds) Natural Language Dialog Systems and Intelligent Assistants. Springer, Cham. https://doi.org/10.1007/978-3-319-19291-8_25


  • DOI: https://doi.org/10.1007/978-3-319-19291-8_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-19290-1

  • Online ISBN: 978-3-319-19291-8

  • eBook Packages: Computer Science (R0)
