ABSTRACT
Extracting and representing text features is the core of text classification. To address the incomplete feature extraction of traditional text classification methods, a text classification model based on a graph convolutional neural network and an attention mechanism is proposed. First, the text is fed into a BERT (Bidirectional Encoder Representations from Transformers) model to obtain word vector representations; a BiGRU (Bidirectional Gated Recurrent Unit) then learns the contextual semantic information of the text, and an attention mechanism screens out the important information, which serves as the node features. Second, the dependency syntax graph of the input text and its corresponding adjacency matrix are constructed. Third, a GCN (Graph Convolutional Network) learns from the node features and the adjacency matrix. Finally, the resulting text features are fed into a classifier. Experiments on two datasets show that the proposed model achieves good classification performance and higher accuracy than the baseline models.
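The pipeline described above (attention-weighted BiGRU outputs used as node features, then GCN propagation over the dependency-based adjacency matrix) can be sketched as follows. This is a minimal NumPy illustration of the standard mechanisms, assuming the common renormalized GCN rule H' = ReLU(D^(-1/2)(A+I)D^(-1/2) H W); it is not the authors' implementation, and all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_pool(hidden, query):
    """Re-weight BiGRU outputs by attention scores (hypothetical query vector)."""
    scores = hidden @ query                      # (n,) raw attention scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over word positions
    return hidden * weights[:, None]             # attention-screened node features

def gcn_layer(adj, feats, weight):
    """One GCN propagation step: ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    a_hat = adj + np.eye(adj.shape[0])           # add self-loops
    d_inv_sqrt = np.diag(a_hat.sum(1) ** -0.5)   # symmetric normalization
    return np.maximum(0.0, d_inv_sqrt @ a_hat @ d_inv_sqrt @ feats @ weight)

# Toy sentence of 4 word nodes; adjacency from a dependency parse.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 1],
                [0, 1, 0, 0],
                [0, 1, 0, 0]], dtype=float)
hidden = rng.normal(size=(4, 8))   # stand-in for BiGRU outputs over BERT vectors
query = rng.normal(size=8)         # hypothetical learnable attention vector
weight = rng.normal(size=(8, 3))   # learnable GCN projection

nodes = attention_pool(hidden, query)
out = gcn_layer(adj, nodes, weight)
print(out.shape)  # (4, 3)
```

A graph-level readout (e.g. mean over node rows) followed by a softmax classifier would complete the model described in the abstract.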