TCKGE: Transformers with contrastive learning for knowledge graph embedding

  • Regular Paper
  • International Journal of Multimedia Information Retrieval

Abstract

Representation learning of knowledge graphs has emerged as a powerful technique for various downstream tasks. In recent years, numerous research efforts have been devoted to knowledge graph embedding. However, previous approaches usually have difficulty modeling complex multi-relational knowledge graphs because of their shallow network architectures. In this paper, we propose a novel framework named Transformers with Contrastive learning for Knowledge Graph Embedding (TCKGE), which aims to learn the complex semantics of multi-relational knowledge graphs with a deep architecture. To effectively capture the rich semantics of knowledge graphs, our framework leverages Transformers to build a deep hierarchical architecture that dynamically learns the embeddings of entities and relations. To obtain more robust knowledge embeddings from this deep architecture, we design a contrastive learning scheme that facilitates optimization, and we explore the effectiveness of several data augmentation strategies. Experimental results on two benchmark datasets show the superiority of TCKGE over state-of-the-art models.
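
The abstract names two concrete ingredients: a Transformer encoder that produces contextual embeddings of entities and relations, and a contrastive objective driven by data augmentation. The PyTorch sketch below only illustrates how such a pipeline could be wired together; the module sizes, the dropout-based augmentation, and the InfoNCE-style loss are assumptions made for illustration, not the authors' released implementation.

```python
# Hypothetical sketch: Transformer-based triple encoder + InfoNCE contrastive loss.
# All names, dimensions and the dropout-based augmentation are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TripleEncoder(nn.Module):
    def __init__(self, num_entities, num_relations, dim=256, layers=3, heads=4, dropout=0.1):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        block = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(block, num_layers=layers)

    def forward(self, heads, rels):
        # Treat (head entity, relation) as a length-2 token sequence, let
        # self-attention contextualize it, and pool the first position.
        seq = torch.stack([self.ent(heads), self.rel(rels)], dim=1)  # (B, 2, D)
        return self.encoder(seq)[:, 0]                               # (B, D)


def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss between two views of the same batch, using in-batch negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature        # pairwise cosine similarities
    labels = torch.arange(z1.size(0))         # diagonal entries are the positives
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    enc = TripleEncoder(num_entities=100, num_relations=10)
    heads = torch.randint(0, 100, (8,))
    rels = torch.randint(0, 10, (8,))
    # Two forward passes over the same batch differ only through dropout,
    # which serves here as a simple augmentation producing two "views".
    loss = info_nce(enc(heads, rels), enc(heads, rels))
    loss.backward()
    print(float(loss))
```

In this sketch, running the encoder twice on the same batch relies on dropout to produce two stochastic views, a common augmentation choice in contrastive setups; other augmentations (for example, entity or relation masking) would plug into the same loss.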

Data availability

The FB15K-237 dataset is described in [51], and the WN18RR dataset is described in [25].

References

  1. Lehmann J et al (2015) DBpedia - a large-scale, multilingual knowledge base extracted from Wikipedia. Sem Web 6(2):167–195. https://doi.org/10.3233/SW-140134

  2. Bollacker KD, Evans C, Paritosh PK, Sturge T, Taylor J (2008) Freebase: a collaboratively created graph database for structuring human knowledge, pp 1247–1250 (ACM)

  3. Suchanek FM, Kasneci G, Weikum G (2007) YAGO: a core of semantic knowledge, pp 697–706 (ACM)

  4. Logan IV RL, Liu NF, Peters ME, Gardner M, Singh S (2019) Barack’s wife Hillary: using knowledge graphs for fact-aware language modeling, pp 5962–5971 (Association for Computational Linguistics)

  5. Zhang Z et al (2019) ERNIE: enhanced language representation with informative entities, pp 1441–1451 (Association for Computational Linguistics)

  6. Hayashi H, Hu Z, Xiong C, Neubig G (2020) Latent relation language models, AAAI Press, pp 7911–7918

  7. Chaudhary C, Goyal P, Prasad DN, Chen YP (2020) Enhancing the quality of image tagging using a visio-textual knowledge base. IEEE Trans Multim 22(4):897–911. https://doi.org/10.1109/TMM.2019.2937181

  8. Xue F et al (2020) Knowledge-based topic model for multi-modal social event analysis. IEEE Trans Multim 22(8):2098–2110. https://doi.org/10.1109/TMM.2019.2951194

  9. Zhang X et al (2015) Enhancing video event recognition using automatically constructed semantic-visual knowledge base. IEEE Trans Multim 17(9):1562–1575. https://doi.org/10.1109/TMM.2015.2449660

  10. Riedel S, Yao L, McCallum A, Marlin BM (2013) Relation extraction with matrix factorization and universal schemas, pp 74–84 (Association for Computational Linguistics)

  11. Xiong W, Hoang T, Wang WY (2017) DeepPath: a reinforcement learning method for knowledge graph reasoning, pp 564–573 (Association for Computational Linguistics)

  12. Verga P, Sun H, Soares LB, Cohen WW (2020) Facts as experts: adaptable and interpretable neural memory over symbolic knowledge. CoRR. arXiv:2007.00849

  13. Nickel M, Murphy K, Tresp V, Gabrilovich E (2016) A review of relational machine learning for knowledge graphs. Proc IEEE 104(1):11–33. https://doi.org/10.1109/JPROC.2015.2483592

  14. Wang Q, Mao Z, Wang B, Guo L (2017) Knowledge graph embedding: a survey of approaches and applications. IEEE Trans Knowl Data Eng 29(12):2724–2743. https://doi.org/10.1109/TKDE.2017.2754499

  15. Weston J, Bordes A, Yakhnenko O, Usunier N (2013) Connecting language and knowledge bases with embedding models for relation extraction, pp 1366–1371 (ACL). https://aclanthology.org/D13-1136/

  16. Bordes A, Chopra S, Weston J (2014) Question answering with subgraph embeddings, pp 615–620 (ACL)

  17. Reinanda R, Meij E, de Rijke M (2020) Knowledge graphs: an information retrieval perspective. Found Trends Inf Retr 14(4):289–444. https://doi.org/10.1561/1500000063

  18. Guo Q et al (2022) A survey on knowledge graph-based recommender systems. IEEE Trans Knowl Data Eng 34(8):3549–3568. https://doi.org/10.1109/TKDE.2020.3028705

  19. Ji S, Pan S, Cambria E, Marttinen P, Yu PS (2022) A survey on knowledge graphs: representation, acquisition, and applications. IEEE Trans Neural Netw Learn Syst 33(2):494–514. https://doi.org/10.1109/TNNLS.2021.3070843

  20. Bordes A, Usunier N, García-Durán A, Weston J, Yakhnenko O (2013) Translating embeddings for modeling multi-relational data, pp 2787–2795

  21. Yang B, Yih W, He X, Gao J, Deng L (2015) Embedding entities and relations for learning and inference in knowledge bases

  22. Sun Z, Deng Z, Nie J, Tang J (2019) RotatE: knowledge graph embedding by relational rotation in complex space (OpenReview.net)

  23. Schlichtkrull MS et al (2018) Modeling relational data with graph convolutional networks, vol 10843. Springer, pp 593–607

  24. Vashishth S, Sanyal S, Nitin V, Talukdar PP (2020) Composition-based multi-relational graph convolutional networks (OpenReview.net)

  25. Dettmers T, Minervini P, Stenetorp P, Riedel S (2018) Convolutional 2d knowledge graph embeddings. AAAI Press, pp 1811–1818

  26. Chen T, Kornblith S, Norouzi M, Hinton GE (2020) A simple framework for contrastive learning of visual representations, vol 119, pp 1597–1607 (PMLR)

  27. Yan Y et al (2021) ConSERT: a contrastive framework for self-supervised sentence representation transfer, pp 5065–5075 (Association for Computational Linguistics)

  28. You Y et al (2020) Graph contrastive learning with augmentations

  29. Wang Q et al (2019) CoKE: contextualized knowledge graph embedding. CoRR. arXiv:1911.02168

  30. Chen S et al (2021) HittER: hierarchical transformers for knowledge graph embeddings, pp 10395–10407 (Association for Computational Linguistics)

  31. Wang Z, Zhang J, Feng J, Chen Z (2014) Knowledge graph embedding by translating on hyperplanes. AAAI Press, pp 1112–1119

  32. Lin Y, Liu Z, Sun M, Liu Y, Zhu X (2015) Learning entity and relation embeddings for knowledge graph completion. AAAI Press, pp 2181–2187

  33. Ji G, He S, Xu L, Liu K, Zhao J (2015) Knowledge graph embedding via dynamic mapping matrix, pp 687–696 (Association for Computational Linguistics)

  34. Jia Y, Wang Y, Lin H, Jin X, Cheng X (2016) Locally adaptive translation for knowledge graph embedding. AAAI Press, pp 992–998

  35. Xiao H, Huang M, Hao Y, Zhu X (2015) TransG: a generative mixture model for knowledge graph embedding. CoRR. arXiv:1509.05488

  36. Trouillon T, Welbl J, Riedel S, Gaussier É, Bouchard G (2016) Complex embeddings for simple link prediction, vol 48, pp 2071–2080 (JMLR.org)

  37. Liu H, Wu Y, Yang Y (2017) Analogical inference for multi-relational embeddings, vol 70, pp 2168–2178 (PMLR)

  38. Kazemi SM, Poole D (2018) Simple embedding for link prediction in knowledge graphs, pp 4289–4300

  39. Balazevic I, Allen C, Hospedales TM (2019) TuckER: tensor factorization for knowledge graph completion, pp 5184–5193 (Association for Computational Linguistics)

  40. Chami I et al (2020) Low-dimensional hyperbolic knowledge graph embeddings, pp 6901–6914 (Association for Computational Linguistics)

  41. Nickel M, Tresp V, Kriegel H (2011) A three-way model for collective learning on multi-relational data, pp 809–816 (Omnipress)

  42. Kipf TN, Welling M (2017) Semi-supervised classification with graph convolutional networks (OpenReview.net)

  43. Zhu Y et al (2021) Graph contrastive learning with adaptive augmentation, pp 2069–2080 (ACM/IW3C2)

  44. Hassani K, Khasahmadi AH (2020) Contrastive multi-view representation learning on graphs, vol 119, pp 4116–4126 (PMLR)

  45. Zhang R, Lu C, Jiao Z, Li X (2021) Deep contrastive graph representation via adaptive homotopy learning. CoRR. arXiv:2106.09244

  46. Velickovic P et al (2018) Deep graph infomax. CoRR. arXiv:1809.10341

  47. Sun F, Hoffmann J, Verma V, Tang J (2020) InfoGraph: unsupervised and semi-supervised graph-level representation learning via mutual information maximization (OpenReview.net)

  48. Wan S, Pan S, Yang J, Gong C (2021) Contrastive and generative graph convolutional networks for graph-based semi-supervised learning, AAAI Press, pp 10049–10057

  49. Devlin J, Chang M, Lee K, Toutanova K (2019) BERT: pre-training of deep bidirectional transformers for language understanding, pp 4171–4186 (Association for Computational Linguistics)

  50. van den Oord A, Li Y, Vinyals O (2018) Representation learning with contrastive predictive coding. CoRR. arXiv:1807.03748

  51. Toutanova K, Chen D (2015) Observed versus latent features for knowledge base and text inference. Association for Computational Linguistics, Beijing, China, pp 57–66

  52. Miller GA (1995) WordNet: a lexical database for English. Commun ACM 38(11):39–41. https://doi.org/10.1145/219717.219748

  53. Broscheit S, Ruffinelli D, Kochsiek A, Betz P, Gemulla R (2020) LibKGE - a knowledge graph embedding library for reproducible research, pp 165–174 (Association for Computational Linguistics)

  54. Vaswani A et al (2017) Attention is all you need, pp 5998–6008

  55. Kingma DP, Ba J (2015) Adam: a method for stochastic optimization

Acknowledgements

This work was supported in part by the National Natural Science Foundation of China under Grants 62036012, 62072456, and 62106262, and by the Open Research Projects of Zhejiang Lab (No. 2021KE0AB05).

Author information

Corresponding author

Correspondence to Quan Fang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Zhang, X., Fang, Q., Hu, J. et al. TCKGE: Transformers with contrastive learning for knowledge graph embedding. Int J Multimed Info Retr 11, 589–597 (2022). https://doi.org/10.1007/s13735-022-00256-3

