
Learning Word Embeddings for Aspect-Based Sentiment Analysis

  • Conference paper
  • First Online:
Computational Linguistics (PACLING 2017)

Abstract

Nowadays, word embeddings, also known as word vectors, play an important role in many NLP tasks. In general, these word representations are learned from an unannotated corpus and are independent of the applications in which they are used. In this paper we aim to enrich word vectors by adding information derived from one of their applications, namely aspect-based sentiment analysis. We propose a new model that combines unsupervised and supervised techniques to capture three kinds of information: the general semantic distributed representation (i.e., conventional word embeddings), the aspect category, and the aspect sentiment, learned from both labeled and unlabeled data. We conduct experiments on restaurant review data (http://spidr-ursa.rutgers.edu/datasets/). Experimental results show that our proposed model outperforms other methods such as Word2Vec and GloVe.
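To make the idea of combining unsupervised and supervised signals concrete, the following is a minimal, hypothetical sketch (not the authors' released implementation) of one way such a joint objective could be set up in PyTorch: an unsupervised context-prediction loss over unlabeled text is added to supervised aspect-category and aspect-sentiment losses over labeled review sentences, so that all three signals update a shared embedding table. The model class, the weights alpha and beta, and all sizes (VOCAB_SIZE, EMB_DIM, N_ASPECTS, N_POLARITIES) are illustrative assumptions, not values from the paper.

# Hypothetical sketch of a joint embedding objective; all names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB_SIZE, EMB_DIM = 20_000, 300
N_ASPECTS, N_POLARITIES = 5, 3          # e.g. aspect categories and pos/neg/neutral
alpha, beta = 1.0, 1.0                   # assumed weights of the supervised terms

class JointEmbeddingModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB_SIZE, EMB_DIM)      # the word vectors being learned
        self.ctx_out = nn.Linear(EMB_DIM, VOCAB_SIZE)     # unsupervised: predict a context word
        self.aspect_out = nn.Linear(EMB_DIM, N_ASPECTS)   # supervised: aspect category of a sentence
        self.senti_out = nn.Linear(EMB_DIM, N_POLARITIES) # supervised: sentiment toward the aspect

    def sentence_vec(self, word_ids):
        # average of the word vectors in a sentence (simplest possible composition)
        return self.emb(word_ids).mean(dim=1)

    def forward(self, center_ids, context_ids, sent_ids, aspect_y, senti_y):
        # (i) unsupervised loss on unlabeled text: predict a context word from the center word
        l_unsup = F.cross_entropy(self.ctx_out(self.emb(center_ids)), context_ids)
        # (ii) + (iii) supervised losses on labeled review sentences
        s = self.sentence_vec(sent_ids)
        l_aspect = F.cross_entropy(self.aspect_out(s), aspect_y)
        l_senti = F.cross_entropy(self.senti_out(s), senti_y)
        return l_unsup + alpha * l_aspect + beta * l_senti

model = JointEmbeddingModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# one illustrative training step with random mini-batches standing in for real data
center = torch.randint(0, VOCAB_SIZE, (64,))
context = torch.randint(0, VOCAB_SIZE, (64,))
sents = torch.randint(0, VOCAB_SIZE, (32, 20))   # 32 labeled sentences, 20 tokens each
aspect_y = torch.randint(0, N_ASPECTS, (32,))
senti_y = torch.randint(0, N_POLARITIES, (32,))

loss = model(center, context, sents, aspect_y, senti_y)
loss.backward()
opt.step()

In this sketch the supervised aspect and sentiment signals are back-propagated into the shared embedding table, which is the general mechanism by which task-specific information can end up encoded in the word vectors alongside the distributional signal.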


Notes

  1. http://spidr-ursa.rutgers.edu/datasets/.

  2. https://www.yelp.com/dataset_challenge/.

  3. https://code.google.com/archive/p/word2vec/.

  4. http://nlp.stanford.edu/projects/glove/.

  5. https://github.com/yoonkim/CNN_sentence.

  6. https://en.wikipedia.org/wiki/Precision_and_recall.

References

  1. Alghunaim, A., Mohtarami, M., Cyphers, S., Glass, J.: A vector space approach for aspect based sentiment analysis. In: Proceedings of NAACL-HLT 2015, pp. 116–122 (2015)

  2. Bengio, Y., Ducharme, R., Vincent, P., Janvin, C.: A neural probabilistic language model. J. Mach. Learn. Res. 3, 1137–1155 (2003)

  3. Brody, S., Elhadad, N.: An unsupervised aspect-sentiment model for online reviews. In: Proceedings of NAACL-HLT, pp. 804–812 (2010)

  4. Collobert, R., Weston, J.: A unified architecture for natural language processing. In: Proceedings of the ICML, pp. 160–167 (2008)

  5. Ganu, G., Elhadad, N., Marian, A.: Beyond the stars: improving rating predictions using review text content. In: Proceedings of WebDB, pp. 1–6 (2009)

  6. Kiritchenko, S., Zhu, X., Cherry, C., Mohammad, S.M.: NRC-Canada-2014: detecting aspects and sentiment in customer reviews. In: Proceedings of SemEval, pp. 437–442 (2014)

  7. Kim, Y.: Convolutional neural networks for sentence classification. In: Proceedings of EMNLP, pp. 1746–1751 (2014)

  8. Maas, A.L., Daly, R.E., Pham, P.T., Huang, D., Ng, A.Y., Potts, C.: Learning word vectors for sentiment analysis. In: Proceedings of ACL, pp. 142–150 (2011)

  9. Mikolov, T., Sutskever, I., Chen, K., Corrado, G., Dean, J.: Distributed representations of words and phrases and their compositionality. In: Proceedings of NIPS, pp. 3111–3119 (2013)

  10. Mikolov, T., Chen, K., Corrado, G., Dean, J.: Efficient estimation of word representations in vector space. CoRR (2013)

  11. Mnih, A., Hinton, G.E.: A scalable hierarchical distributed language model. In: Proceedings of NIPS, pp. 1081–1088 (2008)

  12. Nguyen-Hoang, B.D., Ha, Q.V., Nghiem, M.Q.: Aspect-based sentiment analysis using word embedding restricted Boltzmann machines. In: Proceedings of CSoNet 2016, pp. 285–297 (2016)

  13. Pennington, J., Socher, R., Manning, C.D.: GloVe: global vectors for word representation. In: Proceedings of EMNLP, pp. 1532–1543 (2014)

  14. Pavlopoulos, J., Androutsopoulos, I.: Aspect term extraction for sentiment analysis: new datasets, new evaluation measures and an improved unsupervised method. In: Proceedings of ACL, pp. 44–52 (2014)

  15. Pham, D.H., Le, A.C., Le, T.K.C.: A least square based model for rating aspects and identifying important aspects on review text data. In: Proceedings of NICS, pp. 16–18 (2015)

  16. Pham, D.H., Le, A.C., Nguyen, T.T.T.: Determining aspect ratings and aspect weights from textual reviews by using neural network with paragraph vector model. In: Proceedings of CSoNet, pp. 309–320 (2016)

  17. Poria, S., Cambria, E., Gelbukh, A.: Aspect extraction for opinion mining with a deep convolutional neural network. Knowl. Based Syst. 108, 42–49 (2016)

  18. Ren, Y., Zhang, Y., Zhang, M., Ji, D.: Improving Twitter sentiment classification using topic-enriched multi-prototype word embeddings. In: Proceedings of AAAI, pp. 3038–3044 (2016)

  19. Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning representations by back-propagating errors. Nature 323, 533–536 (1986)

  20. Tang, D., Qin, B., Liu, T.: Learning sentiment-specific word embedding for Twitter sentiment classification. In: Proceedings of ACL, pp. 1555–1565 (2014)

  21. Turian, J., Ratinov, L., Bengio, Y.: Word representations: a simple and general method for semi-supervised learning. In: Proceedings of the ACL, pp. 384–394 (2010)

  22. Wagner, J., Arora, P., Cortes, S., Barman, U., Bogdanova, D., Foster, J., Tounsi, L.: DCU: aspect-based polarity classification for SemEval Task 4. In: Proceedings of SemEval, pp. 223–229 (2014)

  23. Wang, L., Liu, K., Cao, Z., Zhao, J., de Melo, G.: Sentiment-aspect extraction based on restricted Boltzmann machines. In: Proceedings of ACL, pp. 616–625 (2015)

  24. Zhao, W.X., Jiang, J., Yan, H., Li, X.: Jointly modeling aspects and opinions with a MaxEnt-LDA hybrid. In: Proceedings of EMNLP, pp. 56–65 (2010)

  25. Zhou, X., Wan, X., Xiao, J.: Representation learning for aspect category detection in online reviews. In: Proceedings of AAAI, pp. 417–423 (2015)


Acknowledgement

This work was supported by the Vietnam National Foundation for Science and Technology Development (NAFOSTED) under grant number 102.01-2014.22.

Author information


Corresponding author

Correspondence to Anh-Cuong Le.



Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Pham, DH., Le, AC., Le, TKC. (2018). Learning Word Embeddings for Aspect-Based Sentiment Analysis. In: Hasida, K., Pa, W. (eds) Computational Linguistics. PACLING 2017. Communications in Computer and Information Science, vol 781. Springer, Singapore. https://doi.org/10.1007/978-981-10-8438-6_3


  • DOI: https://doi.org/10.1007/978-981-10-8438-6_3

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-8437-9

  • Online ISBN: 978-981-10-8438-6

  • eBook Packages: Computer Science, Computer Science (R0)
