Abstract
Recent studies on self-supervised learning with graph-based recommendation models have achieved outstanding performance. They usually introduce auxiliary learning tasks that maximize the mutual information between representations of the original graph and its augmented views. However, most of these models adopt random dropout to construct the additional graph view, failing to differentiate the importance of edges. The insufficiency of these methods in capturing structural properties of the user-item interaction graph leads to suboptimal recommendation performance. In this paper, we propose a Graph Diffusion Contrastive Learning (GDCL) framework for recommendation to close this gap. Specifically, we perform graph diffusion on the user-item interaction graph. Then, the diffusion graph is encoded to preserve its heterogeneity by learning a dedicated representation for each type of relation. A symmetric contrastive learning objective is used to contrast local node representations of the diffusion graph with those of the user-item interaction graph, so as to learn better user and item representations. Extensive experiments on real datasets demonstrate that GDCL consistently outperforms state-of-the-art recommendation methods.
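The abstract mentions two core ingredients: a graph diffusion step on the user-item interaction graph and a symmetric contrastive objective between the two views. The sketch below illustrates one common instantiation of these ideas (personalized-PageRank diffusion and a symmetric InfoNCE loss); the exact formulation used by GDCL is not specified here, so the function names, normalization, and hyperparameters (`alpha`, `tau`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ppr_diffusion(adj, alpha=0.15):
    """Personalized-PageRank diffusion: S = alpha * (I - (1-alpha) * A_hat)^{-1},
    where A_hat is the symmetrically normalized adjacency matrix.
    S gives a dense "diffusion graph" whose edge weights reflect multi-hop
    connectivity rather than raw one-hop interactions."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    a_hat = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    n = adj.shape[0]
    return alpha * np.linalg.inv(np.eye(n) - (1.0 - alpha) * a_hat)

def symmetric_infonce(z1, z2, tau=0.2):
    """Symmetric InfoNCE: each node's embedding in view 1 is pulled toward its
    counterpart in view 2 (and vice versa) and pushed away from other nodes;
    the two directions are averaged, making the objective symmetric."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau          # pairwise cosine similarities / temperature
    labels = np.arange(len(z1))       # positive pairs sit on the diagonal

    def ce(lg):
        lg = lg - lg.max(axis=1, keepdims=True)             # numerical stability
        logp = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -logp[labels, labels].mean()

    return 0.5 * (ce(logits) + ce(logits.T))

# Toy bipartite graph: nodes 0-1 are users, nodes 2-3 are items.
adj = np.array([[0, 0, 1, 1],
                [0, 0, 1, 0],
                [1, 1, 0, 0],
                [1, 0, 0, 0]], dtype=float)
S = ppr_diffusion(adj)  # dense diffusion view of the interaction graph
```

In practice the two embedding matrices passed to the loss would come from encoding the original interaction graph and its diffusion view, respectively, with separate relation-aware encoders.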
Acknowledgments
This research is supported by Alibaba Group through Alibaba Innovative Research (AIR) Program and Alibaba-NTU Singapore Joint Research Institute (JRI), Nanyang Technological University, Singapore.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Zhang, L., Liu, Y., Zhou, X., Miao, C., Wang, G., Tang, H. (2022). Diffusion-Based Graph Contrastive Learning for Recommendation with Implicit Feedback. In: Bhattacharya, A., et al. Database Systems for Advanced Applications. DASFAA 2022. Lecture Notes in Computer Science, vol 13246. Springer, Cham. https://doi.org/10.1007/978-3-031-00126-0_15
Print ISBN: 978-3-031-00125-3
Online ISBN: 978-3-031-00126-0