
RotatSAGE: A Scalable Knowledge Graph Embedding Model Based on Translation Assumptions and Graph Neural Networks

  • Conference paper

Data Mining and Big Data (DMBD 2022)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1744)

Abstract

Knowledge graphs have been widely used in numerous AI applications. In this paper, we propose an efficient knowledge graph embedding model called RotatSAGE, which combines the RotatE and GraphSAGE models. In the proposed model, RotatE is used to learn the embedding vectors of heterogeneous entities and relations in a knowledge graph. One limitation of RotatE is that it can only learn from a single triplet and cannot exploit local neighborhood information when learning embeddings. To address this issue, we introduce the GraphSAGE model into RotatE. GraphSAGE improves the embedding of an entity by sampling a small, fixed number of its neighbors and aggregating their information. We also propose a sampling strategy to further eliminate redundant entity information and simplify the proposed model. In the experiments, the link prediction task is used to evaluate the performance of the embedding models. Experiments on four benchmark datasets show that the overall performance of RotatSAGE is higher than that of the baseline models.
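The abstract names two ingredients: RotatE, which models each relation as a rotation in complex vector space and scores a triplet by the distance between the rotated head and the tail, and GraphSAGE, which refines an entity's embedding by sampling a small, fixed number of neighbors and aggregating them. The sketch below illustrates both ideas in isolation; it is a minimal, hypothetical illustration (function names, mean aggregation, and the toy dimensions are our assumptions), not the paper's actual RotatSAGE model or training procedure.

```python
import numpy as np

def rotate_score(h, r, t):
    """RotatE-style score: -||h o r - t||_1, where o is the element-wise
    complex product. h, t are complex entity embeddings; r is a
    unit-modulus complex relation embedding (a pure rotation)."""
    return -np.linalg.norm(h * r - t, ord=1)

def sage_mean_aggregate(node_emb, neighbor_embs, num_samples=3, rng=None):
    """GraphSAGE-style step: sample a small, fixed number of neighbors
    and average their embeddings together with the node's own."""
    rng = rng or np.random.default_rng(0)
    k = min(num_samples, len(neighbor_embs))
    idx = rng.choice(len(neighbor_embs), size=k, replace=False)
    return (node_emb + neighbor_embs[idx].sum(axis=0)) / (k + 1)

# Toy example with 4-dimensional complex embeddings.
rng = np.random.default_rng(42)
h = rng.standard_normal(4) + 1j * rng.standard_normal(4)
t = rng.standard_normal(4) + 1j * rng.standard_normal(4)
r = np.exp(1j * rng.uniform(0, 2 * np.pi, 4))   # |r_i| = 1

neighbors = rng.standard_normal((5, 4)) + 1j * rng.standard_normal((5, 4))
h_agg = sage_mean_aggregate(h, neighbors, rng=rng)  # neighbor-enriched head
print(rotate_score(h_agg, r, t))
```

In a real model the aggregated embedding would be fed through learned weights and trained against negative samples; here the two functions only show how a fixed-size neighbor sample can enrich the head entity before the translational (rotational) score is computed.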



Acknowledgement

This work was supported by the National Natural Science Foundation of China (Grant No. 61872107).

Author information

Corresponding author: Yuxin Ding.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Ma, Y., Ding, Y., Wang, G. (2022). RotatSAGE: A Scalable Knowledge Graph Embedding Model Based on Translation Assumptions and Graph Neural Networks. In: Tan, Y., Shi, Y. (eds) Data Mining and Big Data. DMBD 2022. Communications in Computer and Information Science, vol 1744. Springer, Singapore. https://doi.org/10.1007/978-981-19-9297-1_8


  • DOI: https://doi.org/10.1007/978-981-19-9297-1_8

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-19-9296-4

  • Online ISBN: 978-981-19-9297-1

  • eBook Packages: Computer Science, Computer Science (R0)
