JumpLiteGCN: A Lightweight Approach to Hierarchical Text Classification

  • Conference paper

Natural Language Processing and Chinese Computing (NLPCC 2024)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 15362)

Abstract

Hierarchical text classification is a challenging natural language processing task because of its intricate label hierarchy, and existing methods often face the dual constraints of efficiency and performance. To overcome these challenges, this study proposes JumpLiteGCN, a lightweight graph convolutional network model enhanced with jump connections. Simplifying the network structure significantly reduces the model's complexity and computational cost, while the jump-connection mechanism improves the flow of information in deep networks, better capturing and exploiting hierarchical label information and thereby significantly improving classification accuracy. In addition, we propose an adaptive method for computing loss-function weights: label weights are derived from the hierarchical relationships and applied to the loss function, so that during training the model focuses on accurately predicting important samples, further improving performance and generalization. Extensive experiments on two public hierarchical text classification datasets show that our method surpasses existing state-of-the-art approaches on multiple key performance metrics while significantly reducing training and inference time.
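
As a rough illustration of the two ideas the abstract describes, the sketch below gives a minimal PyTorch rendering of (i) a graph-convolution layer over the label hierarchy with a jump (skip) connection and (ii) a multi-label loss scaled by hierarchy-derived label weights. The names (JumpGCNLayer, hierarchical_label_weights), the adjacency normalization, and the depth-based weighting rule are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class JumpGCNLayer(nn.Module):
    """One graph-convolution layer with a jump (skip) connection.

    `adj` is assumed to be a normalized adjacency matrix of the label
    hierarchy, e.g. D^{-1/2}(A + I)D^{-1/2} as in Kipf and Welling.
    """

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.conv = nn.Linear(in_dim, out_dim)
        # Project the input when dimensions differ so it can be added back in.
        self.skip = nn.Linear(in_dim, out_dim) if in_dim != out_dim else nn.Identity()

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = self.conv(adj @ x)           # propagate label features along the hierarchy
        return F.relu(h + self.skip(x))  # jump connection bypasses the convolution


def hierarchical_label_weights(depths: torch.Tensor) -> torch.Tensor:
    # Hypothetical weighting rule: deeper, finer-grained labels get larger
    # weights so the loss emphasizes the harder, more specific classes.
    return 1.0 + 0.5 * depths.float()


def weighted_multilabel_loss(logits, targets, weights):
    # Per-label binary cross-entropy, scaled by the hierarchy-derived weights.
    per_label = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    return (per_label * weights).mean()

In such a setup, label embeddings of shape (num_labels, dim) would be refined by a small stack of JumpGCNLayer modules and matched against the document representation, and the weighted loss would stand in for a plain binary cross-entropy during training.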

Acknowledgments

This work was supported by the National Key R&D Program of China (Grant No. 2023YFC3306304), the "20 New Universities" Project of Jinan City (No. 202228077, 2021GXRC123), the Major Innovation Projects of the Pilot Project of Science, Education and Industry Integration (No. 2022JBZ01-01), the Taishan Industrial Experts Program (No. tscy20231203), the Taishan Scholars Program (No. tsqn202211203), and the Shandong Provincial Natural Science Foundation Innovation and Development Joint Fund Project (No. ZR2023LLZ014).

Author information

Corresponding author

Correspondence to Xiaoming Wu.

Ethics declarations

Disclosure of Interests

The authors declare no competing interests relevant to the content of this article.

Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Liu, T., Liu, X., Dong, Y., Wu, X. (2025). JumpLiteGCN: A Lightweight Approach to Hierarchical Text Classification. In: Wong, D.F., Wei, Z., Yang, M. (eds) Natural Language Processing and Chinese Computing. NLPCC 2024. Lecture Notes in Computer Science (LNAI), vol. 15362. Springer, Singapore. https://doi.org/10.1007/978-981-97-9440-9_5

  • DOI: https://doi.org/10.1007/978-981-97-9440-9_5

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-97-9439-3

  • Online ISBN: 978-981-97-9440-9

  • eBook Packages: Computer Science; Computer Science (R0)
