
A Context-focused Attention Evolution Model for Aspect-based Sentiment Classification

Published: 09 May 2023

Abstract

Owing to their inherent capability to semantically align aspects with their context words, attention mechanisms and Long Short-Term Memory (LSTM) networks are widely adopted for Aspect-Based Sentiment Classification (ABSC). However, attention mechanisms struggle to handle long-range word dependencies when a sentence mentions multiple entities. To address this problem, we propose a Context-Focused Aspect-Based Network that aligns attention before the LSTM, making the model focus on aspect-related words and ignore irrelevant ones, thereby improving classification accuracy. This both alleviates attention distraction and strengthens the text representation. Experiments on two benchmark datasets show that our results are competitive with state-of-the-art ABSC methods. Our approach improves classification accuracy by adaptively adjusting the focus on the context.
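The core idea described in the abstract, re-weighting context words by their relevance to the aspect before the sequence reaches the LSTM, can be sketched as follows. This is an illustrative NumPy sketch under stated assumptions, not the paper's actual architecture: the function name, the dot-product scoring, and all dimensions are assumptions made for the example.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def aspect_focused_context(context_emb, aspect_emb):
    """Re-weight context word embeddings by their attention to the aspect.

    context_emb: (seq_len, dim) word embeddings of the sentence
    aspect_emb:  (dim,) embedding of the aspect term (e.g., an average)

    Returns the re-weighted context, which would then be fed into an
    LSTM, so aspect-related words dominate the encoded sequence.
    """
    scores = context_emb @ aspect_emb        # dot-product relevance per word
    weights = softmax(scores)                # normalize over the sentence
    # Scale each word: emphasize aspect-related words, suppress the rest.
    return context_emb * weights[:, None], weights

rng = np.random.default_rng(0)
ctx = rng.normal(size=(5, 8))                # 5 context words, 8-dim embeddings
asp = rng.normal(size=8)                     # hypothetical aspect embedding
focused, w = aspect_focused_context(ctx, asp)
```

The aspect-conditioned weighting happens before any recurrent encoding, which is what distinguishes the attention-before-LSTM design from the more common LSTM-then-attention pipelines.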


Cited By

  • (2023) Text Classification Using Improving Term Frequency with Gaussian and Multilayer Perceptron Techniques. In 2023 20th International Joint Conference on Computer Science and Software Engineering (JCSSE). 13–18. DOI: 10.1109/JCSSE58229.2023.10202049. Online publication date: 28-Jun-2023.


Published In

ACM Transactions on Asian and Low-Resource Language Information Processing  Volume 22, Issue 5
May 2023
653 pages
ISSN:2375-4699
EISSN:2375-4702
DOI:10.1145/3596451

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 09 May 2023
Online AM: 20 March 2023
Accepted: 01 March 2023
Revised: 27 February 2023
Received: 07 May 2022
Published in TALLIP Volume 22, Issue 5


Author Tags

  1. Aspect-Based Sentiment Classification
  2. masked language model
  3. BERT

Qualifiers

  • Research-article

Funding Sources

  • National Natural Science Foundation of China Project
  • Local science and technology developing foundation guided by the central government
  • Opening Project of State Key Laboratory of Nickel and Cobalt Resources Comprehensive Utilization
  • Fundamental Research Funds for the Central Universities of Central South University

