Graph Contrastive Representation Learning with Input-Aware and Cluster-Aware Regularization

  • Conference paper
Machine Learning and Knowledge Discovery in Databases: Research Track (ECML PKDD 2023)

Abstract

Graph Contrastive Learning (GCL), with broad applications in network analysis and mining, is attracting growing research interest. Although GCL is an outstanding self-supervised technique that extracts concise yet useful information by contrasting differently augmented graph views, it still faces a major challenge: making the extracted semantic information well organized in structure and therefore easily exploited by a downstream classifier. In this paper, we propose a novel cluster-based GCL framework that obtains a semantically well-formed structure of node embeddings by maximizing the mutual information between the input graph and the output embeddings, and that yields a clearer decision boundary through a cluster-level global-local contrastive task. We further show theoretically that the proposed method correctly maximizes the mutual information between an input graph and its output embeddings. Moreover, we improve the method's practical performance with additional refinements, e.g., measuring the uncertainty of clustering and extracting additional structural information via a local-local node-level contrastive module enhanced by Graph Cut. Finally, extensive experiments on six real-world datasets demonstrate the performance gains of our method over prevalent state-of-the-art models.
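
To make the cluster-level global-local contrast described above concrete, here is a minimal, hypothetical sketch (not the paper's implementation; all names and choices are illustrative): each node embedding (local view) is contrasted against per-cluster summary vectors (global views) with an InfoNCE-style cross-entropy, pulling nodes toward their own cluster's summary and pushing them away from the other clusters'.

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-12):
    """Row-wise L2 normalization with a small epsilon for stability."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def cluster_global_local_loss(embeddings, cluster_ids, temperature=0.5):
    """InfoNCE-style global-local contrast at the cluster level:
    each node is classified against the mean embeddings of all clusters,
    with its own cluster's summary as the positive target."""
    z = l2_normalize(np.asarray(embeddings, dtype=float))
    clusters = np.unique(cluster_ids)
    # Global views: one normalized mean embedding per cluster.
    summaries = l2_normalize(np.stack(
        [z[cluster_ids == c].mean(axis=0) for c in clusters]))
    # Cosine similarities node-vs-cluster, scaled by temperature.
    logits = z @ summaries.T / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    own = np.searchsorted(clusters, cluster_ids)  # each node's own cluster
    return -log_probs[np.arange(len(z)), own].mean()
```

As a sanity check, embeddings whose clusters match their geometry yield a lower loss than the same embeddings with scrambled cluster assignments.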

This research was supported by the University-Industry Cooperation Project of Fujian Province, China (2023H6008) and the National Natural Science Foundation of China (12271098). Paper with appendix can be found at https://drive.google.com/file/d/1FVziwZpsq4v5oLvPz9qFr77ozkQwhvFw/view?usp=sharing.

Notes

  1. E.g., spectral embeddings used in Spectral Clustering [33].

  2. Because we apply GLMIMax at the cluster level rather than to the whole graph.
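
As background for the first note: spectral embeddings are the node coordinates given by the eigenvectors of the (normalized) graph Laplacian associated with its smallest eigenvalues. A minimal sketch, assuming a dense NumPy adjacency matrix (illustrative, not taken from the paper):

```python
import numpy as np

def spectral_embedding(adj, k):
    """k-dimensional spectral embedding from the symmetric normalized
    Laplacian L = I - D^{-1/2} A D^{-1/2}: rows are node coordinates
    given by the k eigenvectors with the smallest eigenvalues."""
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5  # guard isolated (degree-0) nodes
    lap = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(lap)  # eigenvalues in ascending order
    return vecs[:, :k]                # n x k embedding matrix
```

On a graph with two disconnected triangles, the 2-dimensional embedding is constant within each component and differs across components, which is exactly what makes it useful for clustering.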

References

  1. Chen, J., Ma, T., Xiao, C.: FastGCN: fast learning with graph convolutional networks via importance sampling. In: 6th International Conference on Learning Representations, ICLR 2018, Vancouver, BC, Canada, 30 April–3 May 2018, Conference Track Proceedings. OpenReview.net (2018)

  2. Ding, K., Xu, Z., Tong, H., Liu, H.: Data augmentation for deep graph learning: a survey. CoRR abs/2202.08235 (2022)

  3. Ericsson, L., Gouk, H., Loy, C.C., Hospedales, T.M.: Self-supervised representation learning: introduction, advances, and challenges. IEEE Signal Process. Mag. 39(3), 42–62 (2022). https://doi.org/10.1109/msp.2021.3134634

  4. Errica, F., Podda, M., Bacciu, D., Micheli, A.: A fair comparison of graph neural networks for graph classification. In: 8th International Conference on Learning Representations, ICLR 2020, Addis Ababa, Ethiopia, 26–30 April 2020. OpenReview.net (2020)

  5. Grill, J.B., et al.: Bootstrap your own latent-a new approach to self-supervised learning. Adv. Neural. Inf. Process. Syst. 33, 21271–21284 (2020)

  6. Hamilton, W.L., Ying, Z., Leskovec, J.: Inductive representation learning on large graphs. In: Guyon, I., et al. (eds.) Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 4–9 December 2017, Long Beach, CA, USA, pp. 1024–1034. Curran Associates Inc. (2017)

  7. Hassani, K., Ahmadi, A.H.K.: Contrastive multi-view representation learning on graphs. CoRR abs/2006.05582 (2020)

  8. Hjelm, R.D., et al.: Learning deep representations by mutual information estimation and maximization. In: 7th International Conference on Learning Representations, ICLR 2019, New Orleans, LA, USA, 6–9 May 2019. OpenReview.net (2019)

  9. Jaiswal, A., Babu, A.R., Zadeh, M.Z., Banerjee, D., Makedon, F.: A survey on contrastive self-supervised learning. Technologies 9(1), 2 (2021). https://doi.org/10.3390/technologies9010002

  10. Jin, M., Zheng, Y., Li, Y.F., Gong, C., Zhou, C., Pan, S.: Multi-scale contrastive siamese networks for self-supervised graph representation learning. In: International Joint Conference on Artificial Intelligence 2021, Paolo, Brazil, pp. 1477–1483. Association for the Advancement of Artificial Intelligence (AAAI), CEUR-WS.org (2021)

  11. Karypis, G., Kumar, V.: A software package for partitioning unstructured graphs, partitioning meshes, and computing fill-reducing orderings of sparse matrices. University of Minnesota, Department of Computer Science and Engineering, Army HPC Research Center, Minneapolis, MN 38 (1998)

  12. Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. In: Bengio, Y., LeCun, Y. (eds.) 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, 7–9 May 2015, Conference Track Proceedings. Conference Track Proceedings (2015)

  13. Kipf, T.N., Welling, M.: Variational graph auto-encoders. Stat 1050, 21 (2016)

  14. Kipf, T.N., Welling, M.: Semi-supervised classification with graph convolutional networks. In: 5th International Conference on Learning Representations, ICLR 2017, Toulon, France, 24–26 April 2017, Conference Track Proceedings. OpenReview.net (2017)

  15. Lin, Z.: Some software packages for partial SVD computation. CoRR abs/1108.1548 (2011)

  16. Mavromatis, C., Karypis, G.: Graph infoclust: leveraging cluster-level node information for unsupervised graph representation learning. CoRR abs/2009.06946 (2020)

  17. Mernyei, P., Cangea, C.: Wiki-CS: a Wikipedia-based benchmark for graph neural networks. CoRR abs/2007.02901 (2020)

  18. Olatunji, I.E., Funke, T., Khosla, M.: Releasing graph neural networks with differential privacy guarantees. CoRR abs/2109.08907 (2021)

  19. Pan, L., Shi, C., Dokmanic, I.: Neural link prediction with walk pooling. CoRR abs/2110.04375 (2021)

  20. Pan, S., Hu, R., Long, G., Jiang, J., Yao, L., Zhang, C.: Adversarially regularized graph autoencoder for graph embedding. In: Lang, J. (ed.) Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, IJCAI 2018, 13–19 July 2018, Stockholm, Sweden, pp. 2609–2615. ijcai.org (2018)


  22. Park, J., Lee, M., Chang, H.J., Lee, K., Choi, J.Y.: Symmetric graph convolutional autoencoder for unsupervised graph representation learning. In: 2019 IEEE/CVF International Conference on Computer Vision, ICCV 2019, Seoul, Korea (South), 27 October–2 November 2019, pp. 6518–6527. IEEE (2019)

  23. Paszke, A., et al.: Pytorch: an imperative style, high-performance deep learning library. In: Advances in Neural Information Processing Systems, vol. 32 (2019)

  24. Peng, Z., et al.: Graph representation learning via graphical mutual information maximization. In: Huang, Y., King, I., Liu, T., van Steen, M. (eds.) WWW 2020: The Web Conference 2020, Taipei, Taiwan, 20–24 April 2020, pp. 259–270. ACM/IW3C2 (2020)

  25. Perozzi, B., Al-Rfou, R., Skiena, S.: Deepwalk: online learning of social representations. In: Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD 2014, pp. 701–710. Association for Computing Machinery, New York (2014). https://doi.org/10.1145/2623330.2623732

  26. Sato, R.: A survey on the expressive power of graph neural networks. CoRR abs/2003.04078 (2020)

  27. Sen, P., Namata, G., Bilgic, M., Getoor, L., Galligher, B., Eliassi-Rad, T.: Collective classification in network data. AI Mag. 29(3), 93–93 (2008)

  28. Shchur, O., Mumme, M., Bojchevski, A., Günnemann, S.: Pitfalls of graph neural network evaluation. CoRR abs/1811.05868 (2018)

  29. Thakoor, S., Tallec, C., Azar, M.G., Munos, R., Velickovic, P., Valko, M.: Bootstrapped representation learning on graphs. CoRR abs/2102.06514 (2021)

  30. Velickovic, P., Cucurull, G., Casanova, A., Romero, A., Lio, P., Bengio, Y.: Graph attention networks. Stat 1050, 4 (2018)

  31. Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. In: ICLR (Poster), vol. 2, no. 3, p. 4 (2019)

  32. Vishwanathan, S.V.N., Schraudolph, N.N., Kondor, R., Borgwardt, K.M.: Graph kernels. J. Mach. Learn. Res. 11, 1201–1242 (2010)

  33. Von Luxburg, U.: A tutorial on spectral clustering. Stat. Comput. 17(4), 395–416 (2007)

  34. Wu, F., Souza, A., Zhang, T., Fifty, C., Yu, T., Weinberger, K.: Simplifying graph convolutional networks. In: Chaudhuri, K., Salakhutdinov, R. (eds.) Proceedings of the 36th International Conference on Machine Learning. Proceedings of Machine Learning Research, Long Beach, California, USA, vol. 97, pp. 6861–6871. PMLR (2019)

  35. Wu, Z., Pan, S., Chen, F., Long, G., Zhang, C., Yu, P.S.: A comprehensive survey on graph neural networks. IEEE Trans. Neural Netw. Learn. Syst. 32(1), 4–24 (2021). https://doi.org/10.1109/tnnls.2020.2978386

  36. Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? In: 7th International Conference on Learning Representations, ICLR 2019, New Orleans, LA, USA, 6–9 May 2019. OpenReview.net (2019)

  37. Yang, Z., Cohen, W.W., Salakhutdinov, R.: Revisiting semi-supervised learning with graph embeddings. In: Balcan, M., Weinberger, K.Q. (eds.) Proceedings of the 33nd International Conference on Machine Learning, ICML 2016, New York City, NY, USA, 19–24 June 2016. JMLR Workshop and Conference Proceedings, vol. 48, pp. 40–48. JMLR.org (2016)

  38. Zhao, T., Liu, Y., Neves, L., Woodford, O., Jiang, M., Shah, N.: Data augmentation for graph neural networks. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 35, no. 12, pp. 11015–11023 (2021)

  39. Zheng, S., Zhu, Z., Zhang, X., Liu, Z., Cheng, J., Zhao, Y.: Distribution-induced bidirectional generative adversarial network for graph representation learning. In: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020, Seattle, WA, USA, 13–19 June 2020, pp. 7222–7231. Computer Vision Foundation/IEEE (2020)

  40. Zhu, Y., Xu, Y., Yu, F., Liu, Q., Wu, S., Wang, L.: Deep graph contrastive representation learning. CoRR abs/2006.04131 (2020)

  41. Zhu, Y., Xu, Y., Yu, F., Liu, Q., Wu, S., Wang, L.: Graph contrastive learning with adaptive augmentation. In: Leskovec, J., Grobelnik, M., Najork, M., Tang, J., Zia, L. (eds.) WWW 2021: The Web Conference 2021, Virtual Event/Ljubljana, Slovenia, 19–23 April 2021, pp. 2069–2080. ACM/IW3C2 (2021)

Author information

Corresponding author

Correspondence to Yang-Geng Fu.

Ethics declarations

Ethics Statement

We believe in using machine learning responsibly and ethically and in minimizing any potential harm associated with its use. We strive to ensure the accuracy and reliability of our models, respect applicable laws, regulations, and best practices, and make sure our models are used ethically and responsibly.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Li, J. et al. (2023). Graph Contrastive Representation Learning with Input-Aware and Cluster-Aware Regularization. In: Koutra, D., Plant, C., Gomez Rodriguez, M., Baralis, E., Bonchi, F. (eds) Machine Learning and Knowledge Discovery in Databases: Research Track. ECML PKDD 2023. Lecture Notes in Computer Science, vol 14170. Springer, Cham. https://doi.org/10.1007/978-3-031-43415-0_39

  • DOI: https://doi.org/10.1007/978-3-031-43415-0_39

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43414-3

  • Online ISBN: 978-3-031-43415-0

  • eBook Packages: Computer Science, Computer Science (R0)
