
A Sentence-Level Sparse Gamma Topic Model for Sentiment Analysis

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10832)

Abstract

Online consumer reviews have become an essential source of information for understanding markets and customer preferences. This research introduces a novel topic model to identify product attributes, and sentiments toward them, at the sentence level. The model uses a recursive definition of the topic distribution within a sentence to avoid the over-parameterization common in topic models. An inference network allows rich features of the content to drive the identification of sentiments, in contrast with other multi-aspect sentiment analysis models that rely on single words. The sentence topic model achieves superior performance in producing coherent topics, and the sentence topic-sentiment model outperforms an existing model on the task of predicting product attribute ratings.
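The abstract's combination of gamma-distributed topic weights with an inference network suggests an architecture in the spirit of autoencoding variational inference for topic models (Srivastava and Sutton, 2017) and deep exponential families (Ranganath et al., 2015). The following is a minimal illustrative sketch, not the authors' implementation: a hypothetical encoder maps a sentence's bag-of-words vector to the shape and rate parameters of a Gamma distribution over topic weights, where small inferred shapes yield sparse weights.

    # Illustrative sketch only (PyTorch), not the paper's code: an inference
    # network producing a sparse Gamma posterior over K sentence-level topics.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GammaTopicEncoder(nn.Module):
        """Maps a bag-of-words vector to Gamma(shape, rate) over K topics."""
        def __init__(self, vocab_size: int, num_topics: int, hidden: int = 100):
            super().__init__()
            self.body = nn.Linear(vocab_size, hidden)
            self.shape_head = nn.Linear(hidden, num_topics)
            self.rate_head = nn.Linear(hidden, num_topics)

        def forward(self, bow: torch.Tensor) -> torch.distributions.Gamma:
            h = torch.tanh(self.body(bow))
            # softplus keeps both parameters positive; shape values below 1
            # push most sampled topic weights toward zero, i.e. sparsity
            shape = F.softplus(self.shape_head(h)) + 1e-3
            rate = F.softplus(self.rate_head(h)) + 1e-3
            return torch.distributions.Gamma(shape, rate)

    # Usage: draw reparameterized topic weights for one sentence and
    # normalize them into topic proportions.
    encoder = GammaTopicEncoder(vocab_size=5000, num_topics=20)
    bow = torch.rand(1, 5000)                        # toy word counts
    q = encoder(bow)
    weights = q.rsample()                            # differentiable sample
    theta = weights / weights.sum(-1, keepdim=True)  # topic proportions

In this sketch, the reparameterized sample keeps the encoder trainable end-to-end against a variational objective; the paper's actual parameterization, its recursion over sentences, and its sentiment component are not reproduced here.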



Author information

Correspondence to Tao Chen.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Chen, T., Parsons, J. (2018). A Sentence-Level Sparse Gamma Topic Model for Sentiment Analysis. In: Bagheri, E., Cheung, J. (eds) Advances in Artificial Intelligence. Canadian AI 2018. Lecture Notes in Computer Science (LNAI), vol 10832. Springer, Cham. https://doi.org/10.1007/978-3-319-89656-4_33


  • DOI: https://doi.org/10.1007/978-3-319-89656-4_33

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-89655-7

  • Online ISBN: 978-3-319-89656-4

  • eBook Packages: Computer Science, Computer Science (R0)
