
Two-Stage Attention Network for Aspect-Level Sentiment Classification

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11304)

Abstract

Currently, most attention-based works adopt a single-stage attention process when generating context representations toward an aspect, and their approaches lack a deliberation process: a generated, aspect-related representation is used directly as the final output without further polishing. In this work, we introduce a deliberation process into context modeling to further polish the attention weights, and propose a two-stage attention network for aspect-level sentiment classification. The network uses a two-level attention model with an LSTM: the first-stage attention generates a raw aspect-related representation, and the second-stage attention polishes and refines this raw representation through deliberation. Since the deliberation component has global information about what the representation to be generated might be, it has the potential to produce a better aspect-related representation by looking into the LSTM hidden states a second time. Experimental results on the Laptop dataset of SemEval-2016 Task 5 indicate that our model achieves a state-of-the-art accuracy of 76.56%.
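To make the two-stage design concrete, below is a minimal PyTorch sketch of the idea the abstract describes, not the authors' exact formulation (the abstract gives no equations): a first attention pass over the LSTM hidden states produces a raw aspect-related representation, and a second, deliberation-style pass re-scores the same hidden states conditioned on that draft. The mean-pooled aspect embedding, the single-layer scoring functions, and all names (TwoStageAttentionNet, attn1, attn2) are illustrative assumptions.

# Minimal sketch of a two-stage (deliberation) attention network.
# Assumptions, not the paper's exact equations: mean-pooled aspect
# embedding and linear attention scorers.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoStageAttentionNet(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn1 = nn.Linear(hidden_dim + embed_dim, 1)      # stage-1 scorer
        self.attn2 = nn.Linear(2 * hidden_dim + embed_dim, 1)  # stage-2 scorer
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, context_ids, aspect_ids):
        H, _ = self.lstm(self.embed(context_ids))          # (B, T, h) hidden states
        aspect = self.embed(aspect_ids).mean(dim=1)        # (B, e) pooled aspect
        a = aspect.unsqueeze(1).expand(-1, H.size(1), -1)  # broadcast over time

        # Stage 1: attention over H yields a raw aspect-related representation r1.
        s1 = self.attn1(torch.cat([H, a], dim=-1)).squeeze(-1)
        r1 = torch.bmm(F.softmax(s1, dim=-1).unsqueeze(1), H).squeeze(1)

        # Stage 2 (deliberation): attend to H a second time, now conditioned on
        # the draft r1, so the scorer has global information about the
        # representation being generated; the result is the polished version.
        r1_t = r1.unsqueeze(1).expand(-1, H.size(1), -1)
        s2 = self.attn2(torch.cat([H, r1_t, a], dim=-1)).squeeze(-1)
        r2 = torch.bmm(F.softmax(s2, dim=-1).unsqueeze(1), H).squeeze(1)

        return self.out(r2)                                # sentiment logits

As a toy sanity check, TwoStageAttentionNet(vocab_size=5000)(torch.randint(0, 5000, (4, 20)), torch.randint(0, 5000, (4, 2))) returns a (4, 3) logit tensor. The essential point is only that the second scorer receives the stage-1 draft as extra input; the paper's refinement step may differ in detail.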

Notes

  1. www.glove.com.

  2. Note that this paper does not describe the details of these models because we could not find literature about them. The performance of these models is taken from http://alt.qcri.org/semeval2016/task5/index.php?id=data-and-tools.

Acknowledgments

This work was sponsored by the National Natural Science Foundation of China (61673235, 61772075) and the Natural Science Foundation of Hebei Province (F2017208012). It was also sponsored by the Key Research Project of Hebei University of Science & Technology (2016ZDYY03) and the Graduate Student Innovation Project of Hebei Province (CXZZSS2017095).

Author information

Corresponding author

Correspondence to Hua Xu.

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Gao, K., Xu, H., Gao, C., Sun, X., Deng, J., Zhang, X. (2018). Two-Stage Attention Network for Aspect-Level Sentiment Classification. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol. 11304. Springer, Cham. https://doi.org/10.1007/978-3-030-04212-7_27

  • DOI: https://doi.org/10.1007/978-3-030-04212-7_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04211-0

  • Online ISBN: 978-3-030-04212-7

  • eBook Packages: Computer Science, Computer Science (R0)
