
A Deep Learning Model Based on Neural Bag-of-Words Attention for Sentiment Analysis

  • Conference paper
  • In: Knowledge Science, Engineering and Management (KSEM 2021)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 12815)

Abstract

In the field of Natural Language Processing, sentiment analysis is one of the core research directions. A central issue in sentiment analysis is how to avoid the shortcoming of using a fixed vector to compute the attention distribution. In this paper, we propose a novel sentiment analysis model based on neural bag-of-words attention, which uses a Bidirectional Long Short-Term Memory network (BiLSTM) to capture the deep semantic features of the text and fuses these features through an attention distribution based on the neural bag-of-words representation. The experimental results show that the proposed method improves accuracy by 2.53%–6.46% compared with the benchmark models.
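
The abstract outlines the architecture at a high level: a BiLSTM encoder whose attention distribution is computed from a neural bag-of-words (NBOW) query rather than a single fixed context vector. The PyTorch sketch below shows one plausible reading of that idea; the layer sizes, the masked-mean NBOW, the dot-product scoring function, and the classifier head are illustrative assumptions, not the authors' exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NBOWAttentionBiLSTM(nn.Module):
    """Hypothetical sketch: a BiLSTM encoder whose attention query is the
    neural bag-of-words (masked mean of word embeddings) of the input,
    instead of a single fixed, learned context vector."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Project the NBOW vector into the BiLSTM output space so it can
        # score the hidden states (assumed scoring function: dot product).
        self.query_proj = nn.Linear(embed_dim, 2 * hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, mask):
        # token_ids: (B, T) word indices; mask: (B, T), 1 for real tokens, 0 for padding.
        mask = mask.float()
        emb = self.embedding(token_ids)                          # (B, T, E)

        # Neural bag-of-words: masked mean of the word embeddings.
        lengths = mask.sum(dim=1, keepdim=True).clamp(min=1.0)   # (B, 1)
        nbow = (emb * mask.unsqueeze(-1)).sum(dim=1) / lengths   # (B, E)

        hidden, _ = self.bilstm(emb)                             # (B, T, 2H)

        # Attention distribution driven by the per-sentence NBOW query.
        query = self.query_proj(nbow).unsqueeze(1)               # (B, 1, 2H)
        scores = (hidden * query).sum(dim=-1)                    # (B, T)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        alpha = F.softmax(scores, dim=-1).unsqueeze(-1)          # (B, T, 1)

        # Fuse the BiLSTM features using the attention weights.
        context = (alpha * hidden).sum(dim=1)                    # (B, 2H)
        return self.classifier(context)                          # (B, num_classes)
```

In this sketch the attention query varies with each input sentence, since it is derived from that sentence's own word embeddings; this is the property the abstract contrasts with attention computed from a fixed vector.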

Notes

  1. https://spacy.io/.

  2. https://pytorch.org/.

  3. http://nlp.stanford.edu/data/glove.6B.zip.
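
The footnotes above point to the tools apparently used in the experiments: spaCy for preprocessing, PyTorch as the deep learning framework, and pretrained GloVe 6B word vectors. As a minimal sketch, the snippet below shows one common way such vectors (e.g. glove.6B.300d.txt unpacked from the linked archive) could be loaded into a PyTorch embedding layer; the word2idx vocabulary mapping and the embedding dimension are hypothetical inputs.

```python
import numpy as np
import torch


def load_glove_embeddings(glove_path, word2idx, embed_dim=300):
    """Build an embedding matrix from a GloVe text file (e.g. glove.6B.300d.txt);
    rows for out-of-vocabulary words keep a small random initialisation."""
    matrix = np.random.normal(scale=0.1, size=(len(word2idx), embed_dim))
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, values = parts[0], parts[1:]
            if word in word2idx and len(values) == embed_dim:
                matrix[word2idx[word]] = np.asarray(values, dtype=np.float32)
    return torch.tensor(matrix, dtype=torch.float32)


# Hypothetical usage: copy the pretrained vectors into the model's embedding layer.
# weights = load_glove_embeddings("glove.6B.300d.txt", word2idx)
# model.embedding.weight.data.copy_(weights)
```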


Acknowledgement

This research is partially supported by the Outstanding Youth Project of the Hunan Provincial Education Department (No. 18B228).

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Liao, J., Yi, Z. (2021). A Deep Learning Model Based on Neural Bag-of-Words Attention for Sentiment Analysis. In: Qiu, H., Zhang, C., Fei, Z., Qiu, M., Kung, S.Y. (eds) Knowledge Science, Engineering and Management. KSEM 2021. Lecture Notes in Computer Science, vol 12815. Springer, Cham. https://doi.org/10.1007/978-3-030-82136-4_38

  • DOI: https://doi.org/10.1007/978-3-030-82136-4_38

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-82135-7

  • Online ISBN: 978-3-030-82136-4

  • eBook Packages: Computer Science, Computer Science (R0)
