ABSTRACT
The purpose of Aspect-level Sentiment Classification (ASC) is to judge the sentiment polarity of words or phrases with subjective emotion in a sentence. The existing mainstream method extracts the context semantics of aspect target words based on an attention mechanism, but the attention mechanism is not sensitive to location information and is not good at capturing syntactic dependency information. To address these problems, this paper proposes an Aspect-Specific Graph Attention Network (ASGAT) model to make up for the shortcomings of the attention mechanism. First, the dependency syntax tree of a sentence is obtained by dependency parsing, and an adjacency matrix with local location information is generated by combining the local positional relationship of aspect target words with their context. Then the context semantics of aspect target words are extracted by a multi-layer Graph Attention Network (GAT), which makes the extracted semantic features of aspect targets richer. The model better captures both long-distance syntactic dependencies and the local information of aspect words. The proposed model achieves better results on five benchmark datasets, and its effectiveness is verified by model comparison and experimental analysis.
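The adjacency matrix described above can be illustrated with a minimal sketch. The paper does not publish its exact construction, so the function name, the `window` parameter, and the weighting scheme below are assumptions for illustration only: dependency edges from a parser are made symmetric with self-loops, and tokens within a local window of the aspect target get an extra connection, approximating the "local location information" the abstract refers to.

```python
import numpy as np

def build_aspect_adjacency(n_tokens, dep_edges, aspect_idx, window=2):
    """Hypothetical adjacency construction: symmetric dependency edges
    with self-loops, plus direct links from the aspect token to its
    local positional neighborhood."""
    A = np.eye(n_tokens)  # self-loops so each node attends to itself
    for head, dep in dep_edges:  # (head, dependent) pairs from a parser
        A[head, dep] = 1.0
        A[dep, head] = 1.0  # treat the dependency tree as undirected
    # Boost the local neighborhood of the aspect target word
    for j in range(n_tokens):
        if abs(j - aspect_idx) <= window:
            A[aspect_idx, j] = 1.0
            A[j, aspect_idx] = 1.0
    return A

# Example: "The food was great", aspect word "food" at index 1
edges = [(1, 0), (3, 1), (3, 2)]
A = build_aspect_adjacency(4, edges, aspect_idx=1)
```

A multi-layer GAT would then use this matrix as its attention mask, so that attention weights are computed only over syntactically or positionally connected tokens rather than the full sentence.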