
Simplified-Boosting Ensemble Convolutional Network for Text Classification

Published in: Neural Processing Letters

Abstract

Graph convolutional networks (GCNs) are strong at extracting global features but neglect word order, which weakens their performance on short-text classification. In contrast, convolutional neural networks (CNNs) capture local contextual information within a sentence. Few existing methods classify both long and short text effectively. We therefore propose an ensemble convolutional network that combines a GCN and a CNN: the GCN captures global information while the CNN extracts local features. In addition, we propose a simplified boosting algorithm in which the CNN re-learns the samples misclassified by the GCN, improving classification performance and reducing training time. Results on four benchmark datasets show that our framework outperforms other state-of-the-art methods while using less memory.
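The two-stage idea in the abstract (train a first model, re-train a second model only on the samples the first one misclassifies, then ensemble the two) can be sketched in miniature. This is an illustrative assumption, not the authors' implementation: the stand-in "global" and "local" models, the function names, and the score-summing ensemble rule are all placeholders for the paper's actual GCN and CNN.

```python
from collections import Counter

def train_global(X, y):
    """Stand-in for the paper's GCN: scores every input with the
    global training-set class frequencies (ignores the input)."""
    n = len(y)
    scores = {c: k / n for c, k in Counter(y).items()}
    return lambda x: dict(scores)

def train_local(X, y):
    """Stand-in for the paper's CNN: nearest-neighbour scoring whose
    confidence decays with distance to the closest training sample."""
    pairs = list(zip(X, y))
    def score(x):
        xi, yi = min(pairs, key=lambda p: abs(p[0] - x))
        return {yi: 1.0 / (1.0 + abs(xi - x))}
    return score

def simplified_boosting_fit(X, y):
    gcn = train_global(X, y)                 # stage 1: global model
    argmax = lambda s: max(s, key=s.get)
    # Stage 2: keep only the samples stage 1 misclassifies and train
    # the second model on them -- the "simplified boosting" step.
    hard = [(x, t) for x, t in zip(X, y) if argmax(gcn(x)) != t]
    cnn = train_local(*map(list, zip(*hard))) if hard else gcn
    def predict(x):
        a, b = gcn(x), cnn(x)
        # Ensemble rule (an assumption): sum the two models' class scores.
        return max(set(a) | set(b), key=lambda c: a.get(c, 0.0) + b.get(c, 0.0))
    return predict

# Toy usage: three "a" samples near 0 and one hard "b" outlier at 5.0.
predict = simplified_boosting_fit([0.0, 0.1, 0.2, 5.0], ["a", "a", "a", "b"])
# predict(0.05) -> "a"; predict(4.8) -> "b" (the outlier region is recovered)
```

The point of the sketch is the data flow: the second-stage model never sees the easy samples, so its capacity is spent entirely on the first model's mistakes, which is also why the scheme trains faster than fitting both models on the full set.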




Notes

  1. http://disi.unitn.it/moschitti/corpora.htm.

  2. https://www.cs.umb.edu/~smimarog/textmining/datasets/.

  3. http://www.cs.cornell.edu/people/pabo/movie-review-data/.


Author information


Corresponding author

Correspondence to Zhigang Meng.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zeng, F., Chen, N., Yang, D. et al. Simplified-Boosting Ensemble Convolutional Network for Text Classification. Neural Process Lett 54, 4971–4986 (2022). https://doi.org/10.1007/s11063-022-10843-4

