
Multi-level disentanglement graph neural network

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Real-world graphs are generally generated from highly entangled latent factors. However, existing deep learning methods for graph-structured data often ignore such entanglement and simply represent the heterogeneous relations between entities as binary edges. In this paper, we propose a novel Multi-level Disentanglement Graph Neural Network (MD-GNN), a unified framework that simultaneously performs edge-level, attribute-level, and node-level disentanglement in an end-to-end manner. MD-GNN takes the original graph structure and node attributes as input and outputs multiple disentangled relation graphs together with disentangled node representations. Specifically, MD-GNN first disentangles the original graph structure into multiple relation graphs, each corresponding to a latent, disentangled relation among entities. The input node attributes are then propagated within each relation graph through a multi-hop diffusion mechanism to capture long-range dependencies between entities, and the disentangled node representations are finally obtained through information aggregation and merging. Extensive experiments on synthetic and real-world datasets show, both qualitatively and quantitatively, that MD-GNN achieves strong disentanglement and also serves well as a general GNN framework for downstream tasks. Code has been made available at: https://github.com/LirongWu/MD-GNN.
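The pipeline described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function name `md_gnn_forward`, the random matrices standing in for learned parameters, and the residual-style diffusion update are all assumptions made for the sketch; the real model learns the edge scores and projections end-to-end.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def md_gnn_forward(A, X, K=3, hops=2):
    """Sketch of the three stages the abstract describes.

    1. Edge-level disentanglement: split each binary edge of A into K
       latent relation graphs via per-relation edge scores (random here,
       learned in the actual model).
    2. Multi-hop diffusion: propagate node attributes `hops` steps within
       each relation graph to capture long-range dependencies.
    3. Aggregation/merging: concatenate the K channel outputs into the
       final disentangled node representations.
    """
    n, d = X.shape
    # Hypothetical per-relation projections (stand-ins for learned weights).
    W = rng.standard_normal((K, d, d)) * 0.1

    # Edge scores -> K relation graphs; softmax over relations per edge,
    # masked by the original adjacency so only real edges are split.
    scores = rng.standard_normal((K, n, n))
    rel = softmax(scores, axis=0) * A

    channels = []
    for k in range(K):
        # Row-normalised diffusion operator for relation k.
        deg = rel[k].sum(axis=1, keepdims=True) + 1e-8
        P = rel[k] / deg
        H = X @ W[k]
        for _ in range(hops):          # multi-hop diffusion
            H = P @ H + X @ W[k]       # keep a residual of the input signal
        channels.append(H)

    # Merge the disentangled channels into one representation per node.
    return np.concatenate(channels, axis=1), rel
```

Note that the K relation graphs sum back to the original adjacency (each edge's weight is distributed across relations), which mirrors the idea of decomposing, rather than discarding, the input structure.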



Notes

  1. Unlike image data, node attributes in graphs are usually represented as vectors in which each dimension carries a specific meaning. This suggests that the correlations between attributes and relations should be node-independent.


Acknowledgments

This work is supported in part by the Science and Technology Innovation 2030 Major Project (No. 2021ZD0150100) and the National Natural Science Foundation of China (No. U21A20427).

Author information


Correspondence to Lirong Wu.



Cite this article

Wu, L., Lin, H., Xia, J. et al. Multi-level disentanglement graph neural network. Neural Comput & Applic 34, 9087–9101 (2022). https://doi.org/10.1007/s00521-022-06930-1
