
Labeled Phrase Latent Dirichlet Allocation

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 10041)

Abstract

In recent years, topic models such as Latent Dirichlet Allocation (LDA) and its variants have been widely used to discover abstract topics in text corpora. Two state-of-the-art topic models are Labeled LDA (LLDA) and PhraseLDA. LLDA is a supervised generative model that incorporates label information, but under its bag-of-words assumption it ignores word order. Conversely, PhraseLDA treats each document as a mixture of phrases, which partly captures word order, but it cannot model supervised label information. In this paper, to overcome the shortcomings of these two models while combining their merits, we propose a novel topic model, called Labeled Phrase LDA, which simultaneously considers supervised label information and word order. Extensive experiments comparing the proposed model with the two state-of-the-art models show that it significantly outperforms the baselines in terms of case studies, perplexity, and scalability.
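The combination described in the abstract can be illustrated with a toy sketch: a phrase-level collapsed-Gibbs-style resampling step in which (i) every word in a phrase shares a single topic assignment, in the spirit of PhraseLDA, and (ii) each document may only draw topics from its observed label set, in the spirit of Labeled LDA. The corpus, priors (ALPHA, BETA), and count tables below are hypothetical illustrations chosen for this sketch, not the authors' implementation or inference equations.

    import random
    from collections import defaultdict

    # Illustrative sketch (not the paper's code): collapsed Gibbs resampling of
    # phrase-level topics, combining the label constraint of Labeled LDA with the
    # shared-topic-per-phrase constraint of PhraseLDA.
    ALPHA, BETA = 0.1, 0.01          # hypothetical symmetric Dirichlet priors
    VOCAB = ["topic", "model", "latent", "dirichlet", "label", "phrase"]
    V = len(VOCAB)
    K = 3                            # hypothetical number of topics (= labels here)

    # Toy corpus: each document is a list of phrases (tuples of word ids)
    # plus its observed label set.
    docs = [
        {"phrases": [(0, 1), (2, 3)], "labels": [0, 1]},   # "topic model", "latent dirichlet"
        {"phrases": [(4, 0), (5, 1)], "labels": [1, 2]},   # "label topic", "phrase model"
    ]

    # Count tables for collapsed Gibbs sampling.
    doc_topic = [defaultdict(int) for _ in docs]        # phrases in doc d assigned topic k
    topic_word = [defaultdict(int) for _ in range(K)]   # word w assigned topic k
    topic_total = [0] * K                               # total words assigned to topic k
    assign = []                                         # current topic of each phrase

    # Random initialisation restricted to each document's label set.
    for d, doc in enumerate(docs):
        assign.append([])
        for phrase in doc["phrases"]:
            k = random.choice(doc["labels"])
            assign[d].append(k)
            doc_topic[d][k] += 1
            for w in phrase:
                topic_word[k][w] += 1
            topic_total[k] += len(phrase)

    def gibbs_sweep():
        """Resample the topic of every phrase once."""
        for d, doc in enumerate(docs):
            for p, phrase in enumerate(doc["phrases"]):
                old = assign[d][p]
                # Remove the phrase's current counts.
                doc_topic[d][old] -= 1
                for w in phrase:
                    topic_word[old][w] -= 1
                topic_total[old] -= len(phrase)
                # Score only topics allowed by the document's labels;
                # every word in the phrase is evaluated under the same topic.
                weights = []
                for k in doc["labels"]:
                    p_k = doc_topic[d][k] + ALPHA
                    for w in phrase:
                        p_k *= (topic_word[k][w] + BETA) / (topic_total[k] + BETA * V)
                    weights.append(p_k)
                new = random.choices(doc["labels"], weights=weights)[0]
                # Add the counts back under the newly sampled topic.
                assign[d][p] = new
                doc_topic[d][new] += 1
                for w in phrase:
                    topic_word[new][w] += 1
                topic_total[new] += len(phrase)

    for _ in range(50):
        gibbs_sweep()
    print(assign)   # per-document, per-phrase topic assignments

The sketch simplifies the phrase likelihood (it treats the words of a phrase as conditionally independent given the shared topic), which is enough to show how the label constraint and the phrase constraint interact during sampling.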


Notes

  1. http://twitter.com/.

  2. https://answers.yahoo.com/.


Acknowledgement

This work was supported by the 863 Program (2015AA015404), the China National Science Foundation (61402036, 60973083, 61273363), the Beijing Technology Project (Z151100001615029), the Science and Technology Planning Project of Guangdong Province (2014A010103009, 2015A020217002), and the Guangzhou Science and Technology Planning Project (201604020179).

Author information


Corresponding author

Correspondence to Xian-Ling Mao.



Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Tang, Y.-K., Mao, X.-L., Huang, H. (2016). Labeled Phrase Latent Dirichlet Allocation. In: Cellary, W., Mokbel, M., Wang, J., Wang, H., Zhou, R., Zhang, Y. (eds.) Web Information Systems Engineering – WISE 2016. Lecture Notes in Computer Science, vol. 10041. Springer, Cham. https://doi.org/10.1007/978-3-319-48740-3_39


  • DOI: https://doi.org/10.1007/978-3-319-48740-3_39


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-48739-7

  • Online ISBN: 978-3-319-48740-3

  • eBook Packages: Computer Science (R0)
