
Surrounding-Based Attention Networks for Aspect-Level Sentiment Classification

  • Conference paper

Artificial Neural Networks and Machine Learning – ICANN 2019: Text and Time Series (ICANN 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11730)


Abstract

Aspect-level sentiment classification aims to identify the polarity of a target word in a sentence. Studies on sentiment classification have found that a target's surrounding words exert a greater influence on its polarity than distant context captured by global attention. However, existing neural-network-based models either depend on expensive phrase-level annotation or do not fully exploit the association between the context words and the target. In this paper, we propose to model the influence of the target's surrounding words via two unidirectional long short-term memory (LSTM) neural networks, and introduce a target-based attention mechanism to discover the underlying relationship between the target and the context words. Empirical results on the SemEval 2014 datasets show that our approach outperforms many competitive sentiment classification baselines. Detailed analysis demonstrates the effectiveness of the proposed surrounding-based LSTM networks and the target-based attention mechanism.
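The target-based attention step described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the hidden states of the two surrounding LSTMs have already been computed and concatenated into a single context matrix, and the bilinear scoring form and all names are illustrative.

```python
import numpy as np

def target_attention(hidden, target, W):
    """Attention-weighted context vector for one target.

    hidden: (n, d) array of context hidden states (e.g. from the
            forward/backward surrounding LSTMs, already combined).
    target: (d,) target-word representation.
    W:      (d, d) learned bilinear scoring matrix (here fixed for the demo).
    """
    scores = hidden @ W @ target            # relevance of each context word to the target
    weights = np.exp(scores - scores.max()) # numerically stable softmax
    weights /= weights.sum()
    return weights @ hidden                 # weighted sum of context hidden states

# Toy demonstration with random "hidden states" and a random "target".
rng = np.random.default_rng(0)
h = rng.standard_normal((5, 8))   # 5 context words, hidden size 8
t = rng.standard_normal(8)
W = np.eye(8)                     # identity scoring matrix for illustration
sentence_vec = target_attention(h, t, W)
```

In a trained model, `sentence_vec` would be fed to a final softmax classifier over sentiment polarities, and `W` would be learned jointly with the LSTM parameters.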


Notes

  1. Detailed information can be found at: http://alt.qcri.org/semeval2014/task4.

  2. Available at: https://nlp.stanford.edu/projects/glove/.


Acknowledgments

This work is supported by the Humanities and Social Sciences Fund of Ministry of Education (13YJC870023) and the National Social Science Fund of China (15BTQ056).

Author information

Correspondence to Pengfei Jiao.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Sun, Y., Wang, X., Liu, H., Wang, W., Jiao, P. (2019). Surrounding-Based Attention Networks for Aspect-Level Sentiment Classification. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds) Artificial Neural Networks and Machine Learning – ICANN 2019: Text and Time Series. ICANN 2019. Lecture Notes in Computer Science, vol 11730. Springer, Cham. https://doi.org/10.1007/978-3-030-30490-4_13


  • DOI: https://doi.org/10.1007/978-3-030-30490-4_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30489-8

  • Online ISBN: 978-3-030-30490-4

  • eBook Packages: Computer Science (R0)
