Abstract
Graph Convolutional Networks (GCNs) have recently flourished in text classification, with models such as the Text Graph Convolutional Network (TextGCN). However, the strong performance of these methods relies on building a graph whose nodes span the entire corpus, which makes the models transductive. Moreover, rich label information is not exploited in the graph structure. In this paper, we propose a new model, the Inductive Light Graph Convolution Network (ILGCN), together with a new graph construction. Our approach builds the graph from labels and words, which removes the dependence of an individual text on the entire corpus and makes ILGCN inductive. In addition, we simplify the model structure and retain only neighborhood aggregation, the most essential component of GCNs. Experiments on multiple benchmarks show that our model outperforms existing state-of-the-art models on several text classification datasets.
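The abstract does not spell out ILGCN's propagation rule, but the "only neighborhood aggregation" simplification it describes is the LightGCN-style, parameter-free propagation: features are repeatedly diffused over the symmetrically normalized word-label adjacency and the per-layer representations are averaged. The sketch below is a minimal NumPy illustration of that idea under those assumptions; the function names (`normalized_adjacency`, `light_propagate`) and the choice of layer averaging are illustrative, not the paper's actual implementation.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetric normalization D^{-1/2} A D^{-1/2} of an adjacency matrix."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def light_propagate(A, X, num_layers=2):
    """Parameter-free neighborhood aggregation (LightGCN-style):
    propagate node features over the normalized word-label graph for
    `num_layers` hops and average the representations of all layers.
    No feature transformation or nonlinearity is applied."""
    A_hat = normalized_adjacency(A)
    layers = [X]
    for _ in range(num_layers):
        layers.append(A_hat @ layers[-1])
    return np.mean(layers, axis=0)
```

Because only word and label nodes (no document nodes) appear in such a graph, an unseen document could presumably be classified inductively by aggregating the propagated embeddings of the words it contains, which is what frees the model from the whole-corpus graph used by TextGCN.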
Acknowledgment
This work is supported by the National Key Research and Development Project (2018YFE0119700), the Key Research and Development Project of Shandong Province (2019JZZY010132, 2019-0101), the Natural Science Foundation of Shandong Province (ZR2018MF003), and the Plan of Youth Innovation Team Development of Colleges and Universities in Shandong Province (SD2019-161).