
Social Media Sentiment Analysis Based on Dependency Graph and Co-occurrence Graph

Published in: Cognitive Computation

Abstract

In recent years, research on sentiment analysis of social texts has progressed rapidly, but existing methods usually rely on a single feature for text representation and fail to exploit the latent features of social texts. Such sparse features limit further improvement of sentiment analysis performance. Intuitively, beyond the plain text itself, features that reveal grammatical rules and semantic associations should all benefit sentiment analysis. This article simultaneously considers diverse structural information, part of speech, and positional association information, and proposes a brain-inspired multi-feature hierarchical graph attention model (MH-GAT), based on co-occurrence and syntactic dependency graphs, for sentiment analysis. The model consists mainly of multi-feature fusion and bi-graph hierarchical attention. Specifically, we first design an input layer that fuses multiple features, including part of speech, position, syntactic dependency, and co-occurrence information, to compensate for the information missing from conventional sentiment analysis methods. For the bi-graph hierarchical attention mechanism, we build hierarchical graphs for each text and use a graph attention network, with its strong aggregation ability, to learn the inherent rules of language expression. Compared with the latest Att-BLSTM, Text-Level-GNN, and TextING models, the sentiment analysis accuracy of the proposed model increases by an average of 5.17% on the Chinese Weibo and English SST2 datasets. The proposed MH-GAT model can thus effectively improve the classification performance of short social texts.
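To make the two graph types concrete, the sketch below builds a sliding-window co-occurrence edge set and a hand-written dependency edge list for a toy English sentence. It is a minimal illustration, not the paper's actual construction: the window size, tokenization, and dependency edges are hypothetical stand-ins, and a real pipeline would obtain the syntactic edges from a dependency parser.

```python
def cooccurrence_edges(tokens, window=3):
    """Undirected co-occurrence edges: connect two distinct tokens that
    appear within the same sliding window (window size is illustrative)."""
    edges = set()
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + window, len(tokens))):
            if tokens[i] != tokens[j]:
                # Sort so each undirected pair is stored once.
                edges.add(tuple(sorted((tokens[i], tokens[j]))))
    return edges

# Toy sentence; in practice tokens come from a tokenizer.
tokens = ["the", "movie", "was", "surprisingly", "good"]
cooc = cooccurrence_edges(tokens, window=3)  # 7 undirected edges

# Hand-written (head, dependent, relation) triples standing in for the
# output of a dependency parser such as spaCy or Stanford CoreNLP.
dep_edges = [
    ("was", "movie", "nsubj"),
    ("movie", "the", "det"),
    ("was", "good", "acomp"),
    ("good", "surprisingly", "advmod"),
]
```

In a model like the one described here, both edge sets would then be turned into adjacency structures over the token nodes and fed to a graph attention network, which learns per-edge attention weights when aggregating neighbor features.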


Notes

  1. https://www.weibo.com

  2. https://www.bilibili.com

  3. https://gluebenchmark.com

  4. https://www.douban.com

  5. http://IMDb.com

References

  1. Zhang L, Liu B. Sentiment analysis and opinion mining. In: Sammut C, Webb GI, editors. Encyclopedia of Machine Learning and Data Mining. Boston: Springer; 2017. p. 1152–61. https://doi.org/10.1007/978-1-4899-7687-1_907.


  2. Mikolov T, Chen K, Corrado G, Dean J. Efficient estimation of word representations in vector space. arXiv preprint; 2013. https://arxiv.org/abs/1301.3781.

  3. Pennington J, Socher R, Manning CD. Glove: global vectors for word representation. In: Proceedings of the 2014 conference on empirical methods in natural language processing. Association for Computational Linguistics; 2014. p. 1532–43. https://doi.org/10.3115/v1/D14-1162.

  4. Felbo B, Mislove A, Søgaard A, Rahwan I, Lehmann S. Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm. In: 2017 Conference on Empirical Methods in Natural Language Processing. EMNLP; 2017. p. 1615–25.

  5. Chen Y, Yuan J, You Q, Luo J. Twitter sentiment analysis via bi-sense emoji embedding and attention-based LSTM. In: Proceedings of the 26th ACM international conference on Multimedia. New York: Association for Computing Machinery; 2018. p. 117–25. https://doi.org/10.1145/3240508.3240533.

  6. Tao Y, Zhang X, Shi L, Wei L, Hai Z, Wahid J A. Joint embedding of emoticons and labels based on CNN for microblog sentiment analysis. In: Proceedings of the Fourth IEEE International Conference on Data Science in Cyberspace. 2019. p. 168–75. https://doi.org/10.1109/DSC.2019.00033.

  7. Fedorenko E, Blank IA, Siegelman M, Mineroff Z. Lack of selectivity for syntax relative to word meanings throughout the language network. Cognition. 2020;203:104348.


  8. Zhang M, Li Z, Fu G, Zhang M. Dependency-based syntax-aware word representations. Artif Intell. 2021;292:103427.


  9. Zeng J, Liu T, Jia W, Zhou J. Fine-grained question-answer sentiment classification with hierarchical graph attention network. Neurocomputing. 2021;457:214–24.


  10. Cambria E. Affective computing and sentiment analysis. IEEE Intell Syst. 2016;31(2):102–7.


  11. Nielsen FÅ. A new ANEW: evaluation of a word list for sentiment analysis in microblogs. In: Proceedings of the ESWC2011 Workshop on ‘Making Sense of Microposts’: big things come in small packages, vol. 718. 2011. p. 93–8.

  12. Oraby S, El-Sonbaty Y, El-Nasr MA. Finding opinion strength using rule-based parsing for Arabic sentiment analysis. In: Castro F, Gelbukh A, González M, editors. MICAI 2013: Advances in Soft Computing and Its Applications, vol. 8266. 2013. p. 509–20. https://doi.org/10.1007/978-3-642-45111-9_44.

  13. Cambria E. An introduction to concept-level sentiment analysis. In: Castro F, Gelbukh A, González M, editors. MICAI 2013: Advances in Soft Computing and Its Applications, vol. 8266. Berlin: Springer; 2013. p. 478–83. https://doi.org/10.1007/978-3-642-45111-9_41.

  14. Li M, Ch’ng E, Chong A, See S. Multi-class Twitter sentiment classification with emojis. Ind Manag Data Syst. 2018;118(9):1804–20. https://doi.org/10.1108/IMDS-12-2017-0582.


  15. Kim Y. Convolutional neural networks for sentence classification. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics; 2014. p. 1746–51. https://doi.org/10.3115/v1/d14-1181.

  16. Hassan A, Mahmood A. Convolutional recurrent deep learning model for sentence classification. IEEE Access. 2018;6:13949–57. https://doi.org/10.1109/ACCESS.2018.2814818.


  17. Hochreiter S, Schmidhuber J. Long short-term memory. Neural Comput. 1997;9(8):1735–80. https://doi.org/10.1162/neco.1997.9.8.1735.


  18. Xu G, Meng Y, Qiu X, Yu Z, Wu X. Sentiment analysis of comment texts based on BiLSTM. IEEE Access. 2019;7:51522–32. https://doi.org/10.1109/ACCESS.2019.2909919.


  19. Cho K, Merrienboer BV, Gulcehre C, Bahdanau D, Bougares F, Schwenk H, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics; 2014. p. 1724–34. https://doi.org/10.3115/v1/D14-1179.

  20. Shobana J, Murali M. An efficient sentiment analysis methodology based on long short-term memory networks. Complex Intell Syst. 2021;7:2485–501. https://doi.org/10.1007/s40747-021-00436-4.


  21. Zhou P, Shi W, Tian J, Qi Z, Li B, Hao H, Xu B. Attention-based bidirectional long short-term memory networks for relation classification. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics; 2016. vol. 2, p. 207–212. https://doi.org/10.18653/v1/P16-2034.

  22. Yang Z, Yang D, Dyer C, He X, Smola A, Hovy E. Hierarchical attention networks for document classification. In: Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Association for Computational Linguistics; 2016. p. 1480–9. https://doi.org/10.18653/v1/N16-1174.

  23. Basiri ME, Nemati S, Abdar M, Cambria E, Acharya UR. ABCDM: an attention-based bidirectional CNN-RNN deep model for sentiment analysis. Future Gener Comput Syst. 2021;115:279–94.


  24. Li W, Zhu L, Cambria E. Taylor’s theorem: a new perspective for neural tensor networks. Knowl Based Syst. 2021;228:107258.


  25. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez A N, Kaiser L, Polosukhin I. Attention is all you need. In: Proceedings of the 31st International Conference on Neural Information Processing Systems. Curran Associates, Inc.; 2017. p. 5998–6008.

  26. Jiang F, Cui A, Liu Y, Ma S. Every term has sentiment: learning from emoticon evidences for Chinese microblog sentiment analysis. In: Zhou G, Li J, Zhao D, Feng Y, editors. Proceedings of the second CCF conference on Natural Language Processing and Chinese Computing, vol. 400. Heidelberg: Springer; 2013. p. 224–235. https://doi.org/10.1007/978-3-642-41644-6_21.

  27. Cambria E, Li Y, Xing FZ , Poria S, Kwok K. SenticNet 6: ensemble application of symbolic and subsymbolic AI for sentiment analysis. In: The 29th ACM International Conference on Information and Knowledge Management. New York: Association for Computing Machinery; 2020. p. 105–14.

  28. Rathan M, Hulipalled VR, Venugopal KR, Patnaik LM. Consumer insight mining: aspect based Twitter opinion mining of mobile phone reviews. Appl Soft Comput. 2017;68:765–73. https://doi.org/10.1016/j.asoc.2017.07.056.


  29. Li D, Rzepka R, Ptaszynski M, Araki K. HEMOS: a novel deep learning-based fine-grained humor detecting method for sentiment analysis of social media. Inf Process Manag. 2020;57(6): 102290. https://doi.org/10.1016/j.ipm.2020.102290.


  30. Belkin M, Niyogi P. Laplacian eigenmaps and spectral techniques for embedding and clustering. In: Proceedings of Advances in Neural Information Processing Systems. MIT Press; 2001. p. 585–91.

  31. Tenenbaum J, Silva VD, Langford JC. A global geometric framework for nonlinear dimensionality reduction. Science. 2000;290(5500):2319–23.


  32. Roweis S, Saul L. Nonlinear dimensionality reduction by locally linear embedding. Science. 2000;290(5500):2323–6. https://doi.org/10.1126/science.290.5500.2323.


  33. Ou M, Cui P, Pei J, Zhang Z, Zhu W. Asymmetric transitivity preserving graph embedding. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: Association for Computing Machinery; 2016. p. 1105–14. https://doi.org/10.1145/2939672.2939751.

  34. Perozzi B, Al-Rfou R, Skiena S. Deepwalk: online learning of social representations. In: Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: Association for Computing Machinery; 2014. p. 701–10. https://doi.org/10.1145/2623330.2623732.

  35. Grover A, Leskovec J. Node2vec: scalable feature learning for networks. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: Association for Computing Machinery; 2016. p. 855–64. https://doi.org/10.1145/2939672.2939754.

  36. Wang D, Cui P, Zhu W. Structural deep network embedding. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: Association for Computing Machinery; 2016. p. 1225–34. https://doi.org/10.1145/2939672.2939753.

  37. Kipf TN, Welling M. Semi-supervised classification with graph convolutional networks. arXiv preprint; 2017. https://arxiv.org/abs/1609.02907.

  38. Schlichtkrull M, Kipf TN, Bloem P, Berg RV, Welling M. Modeling relational data with graph convolutional networks. In: Gangemi A, et al., editors. ESWC 2018: The Semantic Web, vol. 10843. Cham: Springer; 2018. p. 593–607. https://doi.org/10.1007/978-3-319-93417-4_38.


  39. Berg R, Kipf TN, Welling M. Graph convolutional matrix completion. arXiv preprint; 2017. https://arxiv.org/abs/1706.02263.

  40. Bruna J, Zaremba W, Szlam A, Lecun Y. Spectral networks and locally connected networks on graphs. In: Proceedings of the 2nd International Conference on Learning Representations; 2014.

  41. Li R, Wang S, Zhu F, Huang J. Adaptive graph convolutional neural networks. In: Proceedings of the AAAI Conference on Artificial Intelligence. 2018;32(1). https://ojs.aaai.org/index.php/AAAI/article/view/11691.

  42. Zhuang C, Ma Q. Dual graph convolutional networks for graph-based semi-supervised classification. In: Proceedings of the 2018 World Wide Web Conference (WWW '18). International World Wide Web Conferences Steering Committee; 2018. p. 499–508. https://doi.org/10.1145/3178876.3186116.

  43. Hamilton W, Ying R, Leskovec J. Inductive representation learning on large graphs. In: Proceedings of the 31st International Conference on Neural Information Processing Systems. NIPS; 2017. p. 1025–35.

  44. Velickovic P, Cucurull G, Casanova A, Romero A, Lio P, Bengio Y. Graph attention networks. In: Proceedings of the 6th International Conference on Learning Representations; 2018.

  45. Yao L, Mao C, Luo Y. Graph convolutional networks for text classification. In: Proceedings of the AAAI Conference on Artificial Intelligence. 2019;33(1):7370–7. https://doi.org/10.1609/aaai.v33i01.33017370.

  46. Huang L, Ma D, Li S, Zhang X, Wang H. Text level graph neural network for text classification. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics; 2019. p. 3442–8. https://doi.org/10.18653/v1/D19-1345.

  47. Gu S, Zhang L, Hou Y, Song Y. A position-aware bidirectional attention network for aspect-level sentiment analysis. In: Proceedings of the 27th International Conference on Computational Linguistics. Association for Computational Linguistics; 2018. p. 774–84.

  48. Schlichtkrull M, Kipf TN, Bloem P, Berg RV, Titov I, Welling M. Modeling relational data with graph convolutional networks. In: Gangemi A, et al., editors. ESWC 2018: The Semantic Web. Cham: Springer; 2018. p. 593–607. https://doi.org/10.1007/978-3-319-93417-4_38.


  49. Zhang M, Qian T. Convolution over hierarchical syntactic and lexical graphs for aspect level sentiment analysis. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics; 2020. p. 3540–9. https://doi.org/10.18653/v1/2020.emnlp-main.286.

  50. Wang X, Ji H, Shi C, Wang B, Cui P, Yu PS, et al. Heterogeneous graph attention network. In: The World Wide Web Conference (WWW '19). Association for Computing Machinery; 2019. p. 2022–32. https://doi.org/10.1145/3308558.3313562.

  51. Bhagat R, Muralidharan S, Lobzhanidze A, Vishwanath S. Buy it again: modeling repeat purchase recommendations. In: Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Association for Computing Machinery; 2018. p. 62–70.

  52. Zhang Y, Yu X, Cui Z, Wu S, Wen Z, Wang L. Every document owns its structure: inductive text classification via graph neural networks. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. ACL; 2020. p. 334–9. https://doi.org/10.18653/v1/2020.acl-main.31.


Funding

This study was funded by the National Natural Science Foundation of China (grant number 71502125).

Author information

Authors and Affiliations

Authors

Contributions

This work was carried out in close collaboration between all co-authors. Zhigang Jin, Manyue Tao, and Xiaofang Zhao first defined the research theme and contributed an early design of the method. Zhigang Jin, Manyue Tao, Xiaofang Zhao, and Yi Hu further implemented and refined the method and wrote the paper. All authors have contributed to, seen, and approved the final manuscript.

Corresponding author

Correspondence to Zhigang Jin.

Ethics declarations

Ethics Approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Jin, Z., Tao, M., Zhao, X. et al. Social Media Sentiment Analysis Based on Dependency Graph and Co-occurrence Graph. Cogn Comput 14, 1039–1054 (2022). https://doi.org/10.1007/s12559-022-10004-8

