DOI: 10.1145/3704323.3704338
research-article

Aspect-level Sentiment Analysis based on XLNet-BiLSTM-ATT

Published: 07 January 2025

Abstract

In aspect-level sentiment analysis research, traditional neural networks rely mainly on attention mechanisms to weight individual words; the grammatical structure of sentences, the dependency information of aspect words, and part-of-speech information are often ignored during feature extraction. Moreover, most existing mainstream approaches use models such as Word2Vec and GloVe for word embedding, but the static word vectors these models generate capture only the semantic information of words and cannot reflect their contextual relationships. In this paper, we propose an aspect-level sentiment analysis framework based on XLNet-BiLSTM-ATT. The framework uses XLNet to learn global contextual word representations for embedding, adopting the Transformer-XL feature extractor, which better captures global dependencies. A BiLSTM then encodes the embedded sequence to obtain the hidden vectors of the words. At the attention layer, attention scores are computed separately over the part-of-speech tags and the dependency labels of the data to obtain two feature representations; finally, these two features are fused and classified through a Softmax layer. Experimental results on three publicly available datasets show that the framework achieves significant improvements in both accuracy and F1 score.
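The attention-and-fusion step described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the random hidden states stand in for BiLSTM outputs, and the query vectors `q_pos` and `q_dep` are hypothetical stand-ins for representations derived from part-of-speech tags and dependency labels.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(H, q):
    # H: (seq_len, d) hidden vectors; q: (d,) query vector.
    scores = softmax(H @ q)        # (seq_len,) attention weights
    return scores @ H              # (d,) attention-weighted feature

rng = np.random.default_rng(0)
seq_len, d, n_classes = 6, 8, 3
H = rng.normal(size=(seq_len, d))  # stand-in for BiLSTM hidden states
q_pos = rng.normal(size=d)         # stand-in part-of-speech query
q_dep = rng.normal(size=d)         # stand-in dependency-label query

# Compute the two attention features separately, fuse by concatenation,
# then classify with a Softmax layer.
feat = np.concatenate([attend(H, q_pos), attend(H, q_dep)])  # (2d,)
W = rng.normal(size=(2 * d, n_classes))
probs = softmax(feat @ W)          # sentiment class distribution
```

The design point the sketch mirrors is that the two attention scores are computed independently and fused only before the classifier, rather than mixed into a single attention pass.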

References

[1]
Tomas Mikolov. 2013. Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781 (2013).
[2]
Jeffrey Pennington, Richard Socher, and Christopher D Manning. 2014. GloVe: Global vectors for word representation. In Proceedings of the 2014 conference on empirical methods in natural language processing (EMNLP). 1532–1543.
[3]
Yu Sun, Shuohuan Wang, Yukun Li, Shikun Feng, Xuyi Chen, Han Zhang, Xin Tian, Danxiang Zhu, Hao Tian, and Hua Wu. 2019. ERNIE: Enhanced representation through knowledge integration. arXiv preprint arXiv:1904.09223 (2019).
[4]
Zhilin Yang. 2019. XLNet: Generalized autoregressive pretraining for language understanding. arXiv preprint arXiv:1906.08237 (2019).
[5]
A Rakhlin. 2016. Convolutional neural networks for sentence classification. GitHub 6 (2016), 25.
[6]
Liu Longfei, Yang Liang, Zhang Shaowu, et al. 2015. Analysis of Weibo emotional tendency based on convolutional neural network. Journal of Chinese Information Processing 29, 6 (2015), 159–165.
[7]
Dehong Ma, Sujian Li, Xiaodong Zhang, and Houfeng Wang. 2017. Interactive attention networks for aspect-level sentiment classification. arXiv preprint arXiv:1709.00893 (2017).
[8]
Duyu Tang, Bing Qin, Xiaocheng Feng, and Ting Liu. 2015. Effective LSTMs for target-dependent sentiment classification. arXiv preprint arXiv:1512.01100 (2015).
[9]
Binxuan Huang, Yanglan Ou, and Kathleen M Carley. 2018. Aspect level sentiment classification with attention-over-attention neural networks. In Social, Cultural, and Behavioral Modeling: 11th International Conference, SBP-BRiMS 2018, Washington, DC, USA, July 10-13, 2018, Proceedings 11. Springer, 197–206.
[10]
Yequan Wang, Minlie Huang, Xiaoyan Zhu, and Li Zhao. 2016. Attention-based LSTM for aspect-level sentiment classification. In Proceedings of the 2016 conference on empirical methods in natural language processing. 606–615.
[11]
Duyu Tang, Bing Qin, and Ting Liu. 2016. Aspect level sentiment classification with deep memory network. arXiv preprint arXiv:1605.08900 (2016).
[12]
Petar Veličković, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Lio, and Yoshua Bengio. 2017. Graph attention networks. arXiv preprint arXiv:1710.10903 (2017).
[13]
Kai Wang, Weizhou Shen, Yunyi Yang, Xiaojun Quan, and Rui Wang. 2020. Relational graph attention network for aspect-based sentiment analysis. arXiv preprint arXiv:2004.12362 (2020).
[14]
T Dozat. 2016. Deep biaffine attention for neural dependency parsing. arXiv preprint arXiv:1611.01734 (2016).
[15]
Chen Zhang, Qiuchi Li, and Dawei Song. 2019. Aspect-based sentiment classification with aspect-specific graph convolutional networks. arXiv preprint arXiv:1909.03477 (2019).

Published In

ICCPR '24: Proceedings of the 2024 13th International Conference on Computing and Pattern Recognition
October 2024
448 pages
ISBN:9798400717482
DOI:10.1145/3704323

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Aspect-level sentiment analysis
  2. word embedding
  3. XLNet
  4. attention mechanisms


