A Nonparametric N-Gram Topic Model with Interpretable Latent Topics

  • Conference paper
Information Retrieval Technology (AIRS 2013)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 8281)

Included in the following conference series: Asia Information Retrieval Societies Conference (AIRS)

Abstract

Most nonparametric topic models, such as the Hierarchical Dirichlet Process, can be viewed as infinite-dimensional extensions of Latent Dirichlet Allocation, and like it they rely on the bag-of-words assumption. They therefore discard the ordering of words inherent in the text, which could give the computational model extra leverage. We present a new nonparametric topic model that not only maintains word order during topic discovery but also generates topical n-grams, leading to more interpretable latent topics within the family of nonparametric topic models. Our experimental results show improved performance over current state-of-the-art topic models in document modeling and in generating topical n-grams.
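
As a rough illustration of the kind of generative process the abstract describes, here is a minimal Python sketch, not the authors' actual model: each word either draws a topic from an unbounded pool via a Chinese restaurant process, or merges with the preceding word into a topical n-gram under that word's topic, so word order shapes what is generated. All names and values below (alpha, p_bigram, the toy vocabulary) are illustrative assumptions.

    # Minimal sketch (assumed names/values; not the paper's exact model):
    # each word either starts a fresh topic draw from a Chinese restaurant
    # process (so the number of topics is unbounded) or merges with the
    # previous word into a topical n-gram under that word's topic.
    import random

    random.seed(0)

    alpha = 1.0         # CRP concentration: higher -> new topics appear more often
    topic_counts = []   # number of words assigned to each topic so far
    vocab = ["data", "mining", "topic", "model", "neural", "network"]  # toy vocabulary

    def crp_draw(counts, concentration):
        """Draw a topic index from a Chinese restaurant process."""
        total = sum(counts) + concentration
        r = random.uniform(0, total)
        for k, c in enumerate(counts):
            if r < c:
                return k        # reuse an existing topic
            r -= c
        counts.append(0)        # open a new table, i.e. a brand-new topic
        return len(counts) - 1

    def generate(n_words, p_bigram=0.4):
        """Generate a toy word sequence with topical n-gram merging."""
        words, prev_topic = [], None
        for _ in range(n_words):
            if words and random.random() < p_bigram:
                # Merge with the previous word under its topic: the output
                # token becomes an n-gram, so word order matters here.
                k = prev_topic
                words[-1] += "_" + random.choice(vocab)
            else:
                k = crp_draw(topic_counts, alpha)
                words.append(random.choice(vocab))
            topic_counts[k] += 1
            prev_topic = k
        return words

    print(generate(12))
    print("topics used:", len(topic_counts))

The point of the sketch is structural: the number of topics grows with the data instead of being fixed in advance, and the merging step yields multi-word topical units, which is what makes the resulting topics easier to interpret.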

The work described in this paper is substantially supported by grants from the Research Grants Council of the Hong Kong Special Administrative Region, China (Project Code: CUHK413510) and the Direct Grant of the Faculty of Engineering, CUHK (Project Code: 2050522). This work is also affiliated with the CUHK MoE-Microsoft Key Laboratory of Human-centric Computing and Interface Technologies.

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Jameel, S., Lam, W. (2013). A Nonparametric N-Gram Topic Model with Interpretable Latent Topics. In: Banchs, R.E., Silvestri, F., Liu, T.-Y., Zhang, M., Gao, S., Lang, J. (eds) Information Retrieval Technology. AIRS 2013. Lecture Notes in Computer Science, vol 8281. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-45068-6_7

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-45068-6_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-45067-9

  • Online ISBN: 978-3-642-45068-6

  • eBook Packages: Computer Science, Computer Science (R0)
