
An End-to-End Dense Connected Heterogeneous Graph Convolutional Neural Network

  • Conference paper
  • Neural Information Processing (ICONIP 2023)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14447)

Abstract

Graph convolutional networks (GCNs) are powerful models for graph-structured data learning tasks. However, most existing GCNs may confront two major challenges when dealing with heterogeneous graphs: (1) predefined meta-paths are required to capture the semantic relations between nodes of different types, which may not exploit all the useful information in the graph; (2) performance degradation and semantic confusion may occur as the network depth grows, which limits their ability to capture long-range dependencies. To meet these challenges, we propose Dense-HGCN, an end-to-end densely connected heterogeneous graph convolutional neural network for learning node representations. Dense-HGCN computes the attention weights between different nodes and incorporates the information of previous layers into each layer's aggregation process via a specific fuse function. Moreover, Dense-HGCN leverages multi-scale information for node classification and other downstream tasks. Experimental results on real-world datasets demonstrate the superior performance of Dense-HGCN in enhancing representational power compared with several state-of-the-art methods.

The work described in this paper was supported partially by the National Natural Science Foundation of China (12271111), Special Support Plan for High Level Talents of Guangdong Province (2019TQ05X571), Foundation of Guangdong Educational Committee (2019KZDZX1023), Project of Guangdong Province Innovative Team (2020WCXTD011).
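As a rough illustration of the two mechanisms the abstract highlights, attention-weighted aggregation over relation-specific neighbourhoods and a dense fuse of every previous layer's output into the current layer, the following is a minimal PyTorch sketch. The class name DenseHGCNLayer, the dense adjacency-mask inputs, and the concatenation-plus-linear fuse function are assumptions made for illustration only; they do not reproduce the paper's actual formulation.

```python
# Minimal, hypothetical sketch of a dense-connected heterogeneous GCN layer.
# Assumptions (not from the paper): dense [N, N] adjacency masks per relation,
# and a fuse function implemented as concatenation followed by a linear map.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseHGCNLayer(nn.Module):
    def __init__(self, hidden_dim: int, num_relations: int, num_prev_layers: int):
        super().__init__()
        # One attention scorer and one message transform per relation type.
        self.attn = nn.ModuleList(
            [nn.Linear(2 * hidden_dim, 1, bias=False) for _ in range(num_relations)]
        )
        self.msg = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(num_relations)]
        )
        # Dense fusion: concatenate the current state, the new aggregate,
        # and every previous layer's output, then project back to hidden_dim.
        self.fuse = nn.Linear((num_prev_layers + 2) * hidden_dim, hidden_dim)

    def forward(self, h, adjs, prev_outputs):
        # h:            [N, hidden_dim] current node states
        # adjs:         list of [N, N] adjacency masks, one per relation
        # prev_outputs: list of [N, hidden_dim] outputs of earlier layers
        aggregated = torch.zeros_like(h)
        for r, adj in enumerate(adjs):
            m = self.msg[r](h)  # relation-specific messages
            # Pairwise attention logits for this relation.
            src = h.unsqueeze(1).expand(-1, h.size(0), -1)
            dst = h.unsqueeze(0).expand(h.size(0), -1, -1)
            logits = self.attn[r](torch.cat([src, dst], dim=-1)).squeeze(-1)
            logits = logits.masked_fill(adj == 0, float("-inf"))
            alpha = torch.softmax(logits, dim=-1)
            alpha = torch.nan_to_num(alpha)  # rows with no neighbours
            aggregated = aggregated + alpha @ m
        # Fuse the current aggregate with the history of layer outputs.
        fused = torch.cat([h, aggregated] + prev_outputs, dim=-1)
        return F.relu(self.fuse(fused))


if __name__ == "__main__":
    N, d = 5, 8
    h0 = torch.randn(N, d)
    adjs = [torch.randint(0, 2, (N, N)).float() for _ in range(2)]
    layer1 = DenseHGCNLayer(d, num_relations=2, num_prev_layers=0)
    layer2 = DenseHGCNLayer(d, num_relations=2, num_prev_layers=1)
    h1 = layer1(h0, adjs, prev_outputs=[])
    h2 = layer2(h1, adjs, prev_outputs=[h0])  # dense connection back to layer 0
    print(h2.shape)  # torch.Size([5, 8])
```

In this sketch the dense connection simply widens the input of the fuse projection as the stack grows, so later layers see multi-scale information from all earlier layers rather than only their immediate predecessor.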



Author information

Correspondence to Jia Cai.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Yan, R., Cai, J. (2024). An End-to-End Dense Connected Heterogeneous Graph Convolutional Neural Network. In: Luo, B., Cheng, L., Wu, ZG., Li, H., Li, C. (eds) Neural Information Processing. ICONIP 2023. Lecture Notes in Computer Science, vol 14447. Springer, Singapore. https://doi.org/10.1007/978-981-99-8079-6_36


  • DOI: https://doi.org/10.1007/978-981-99-8079-6_36


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8078-9

  • Online ISBN: 978-981-99-8079-6

  • eBook Packages: Computer Science, Computer Science (R0)
