
A structure distinguishable graph attention network for knowledge base completion

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

A knowledge graph is a collection of triples, often represented in the form “subject,” “relation,” “object.” The task of knowledge graph completion (KGC) is to automatically predict missing links by reasoning over the information already present in the knowledge graph. The recent popularization of graph neural networks has also spread to KGC, where techniques such as SACN achieve dramatic improvements over the previous state of the art. However, those models still lack the ability to distinguish different local structures within a graph, which leads to the over-smoothing problem. In this work, we propose SD-GAT, a graph attention network with a structure-distinguishable neighborhood aggregation scheme that models an injective function to aggregate information from the neighborhood. The model consists of two modules. The encoder is a graph attention network improved with our neighborhood aggregation scheme, which yields more distinct representations of entities and relations. The decoder is a convolutional neural network using \(3\times 3\) convolution filters. Our empirical research provides an effective way to increase the discriminative power of graph attention networks, and we show significant improvements of the proposed SD-GAT over state-of-the-art methods on the standard FB15K-237 and WN18RR datasets.
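The core idea of a structure-distinguishable (injective) neighborhood aggregation combined with GAT-style attention can be sketched roughly as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation: the two-layer MLP, the LeakyReLU slope, and all parameter names are assumptions, and it combines the GIN-style \((1+\epsilon)\) self-term and sum aggregation with standard single-head attention.

```python
import numpy as np

def mlp(x, W1, b1, W2, b2):
    # Two-layer perceptron with ReLU; in the GIN recipe this is the
    # universal approximator that makes the aggregation injective.
    return np.maximum(x @ W1 + b1, 0.0) @ W2 + b2

def sd_gat_layer(h, neighbors, a, eps, W1, b1, W2, b2):
    """One structure-distinguishable aggregation step for every node.

    h         : (n, d) entity embeddings
    neighbors : list of (non-empty) neighbor-index lists, one per node
    a         : (2*d,) attention parameter vector
    eps       : scalar weighting the node's own embedding (GIN-style)
    """
    out = np.empty_like(h)
    for v in range(h.shape[0]):
        nbrs = neighbors[v]
        # GAT-style attention logits over the neighborhood
        logits = np.array([np.concatenate([h[v], h[u]]) @ a for u in nbrs])
        logits = np.where(logits > 0, logits, 0.2 * logits)  # LeakyReLU
        alpha = np.exp(logits - logits.max())
        alpha = alpha / alpha.sum()                          # softmax weights
        # Sum (rather than mean/max) aggregation preserves multiset
        # information; the (1 + eps) self-term keeps the center node
        # distinguishable from its neighborhood.
        agg = (1.0 + eps) * h[v] + (alpha[:, None] * h[nbrs]).sum(axis=0)
        out[v] = mlp(agg, W1, b1, W2, b2)
    return out
```

The key design point illustrated here is that mean- or max-style aggregators map different neighborhood multisets to the same vector, whereas a weighted sum followed by an MLP can keep them apart, which is what "structure distinguishable" refers to.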




Availability of data and material

All data used during this study are available at https://github.com/thunlp/OpenKE/tree/master/benchmarks.

References

  1. Suchanek FM, Kasneci G, Weikum G (2007) Yago: a core of semantic knowledge. In: Proceedings of the 16th international conference on World Wide Web, pp 697–706, https://doi.org/10.1145/1242572.1242667

  2. Carlson A, Betteridge J, Kisiel B, Settles B, Hruschka E, Mitchell T (2010) Toward an architecture for never-ending language learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol 24

  3. Vrandečić D, Krötzsch M (2014) Wikidata: a free collaborative knowledgebase. Comm ACM 57(10):78–85. https://doi.org/10.1145/2629489


  4. Choi E, Kwiatkowski T, Zettlemoyer L (2015) Scalable semantic parsing with partial ontologies. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pp 1311–1320, https://doi.org/10.3115/v1/P15-1127

  5. Bordes A, Usunier N, Garcia-Duran A, Weston J, Yakhnenko O (2013) Translating embeddings for modeling multi-relational data. In: Neural Information Processing Systems (NIPS), pp 1–9

  6. Wang Z, Zhang J, Feng J, Chen Z (2014) Knowledge graph embedding by translating on hyperplanes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol 28

  7. Lin Y, Liu Z, Sun M, Liu Y, Zhu X (2015) Learning entity and relation embeddings for knowledge graph completion. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol 29

  8. Trouillon T, Dance CR, Welbl J, Riedel S, Gaussier É, Bouchard G (2017) Knowledge graph completion via complex tensor factorization. arXiv preprint arXiv:170206879

  9. Balažević I, Allen C, Hospedales TM (2019) TuckER: tensor factorization for knowledge graph completion. arXiv preprint arXiv:190109590 https://doi.org/10.18653/v1/D19-1522

  10. Nguyen DQ, Nguyen TD, Nguyen DQ, Phung D (2017) A novel embedding model for knowledge base completion based on convolutional neural network. arXiv preprint arXiv:171202121 https://doi.org/10.18653/v1/N18-2053

  11. Dettmers T, Minervini P, Stenetorp P, Riedel S (2018) Convolutional 2d knowledge graph embeddings. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol 32

  12. Vu T, Nguyen TD, Nguyen DQ, Phung D, et al. (2019) A capsule network-based embedding model for knowledge graph completion and search personalization. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp 2180–2189, https://doi.org/10.18653/v1/N19-1226

  13. Xu K, Li C, Tian Y, Sonobe T, Kawarabayashi Ki, Jegelka S (2018b) Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, PMLR, pp 5453–5462

  14. Gilmer J, Schoenholz SS, Riley PF, Vinyals O, Dahl GE (2017) Neural message passing for quantum chemistry. In: International Conference on Machine Learning, PMLR, pp 1263–1272

  15. Schlichtkrull M, Kipf TN, Bloem P, Van Den Berg R, Titov I, Welling M (2018) Modeling relational data with graph convolutional networks. In: European semantic web conference, Springer, pp 593–607

  16. Shang C, Tang Y, Huang J, Bi J, He X, Zhou B (2019) End-to-end structure-aware convolutional networks for knowledge base completion. Proceedings of the AAAI Conference on Artificial Intelligence 33: 3060–3067. https://doi.org/10.1609/AAAI.V33I01.33013060

  17. Veličković P, Cucurull G, Casanova A, Romero A, Lio P, Bengio Y (2017) Graph attention networks. arXiv preprint arXiv:171010903

  18. Thekumparampil KK, Wang C, Oh S, Li LJ (2018) Attention-based graph neural network for semi-supervised learning. arXiv preprint arXiv:180303735

  19. Lee JB, Rossi R, Kong X (2018) Graph classification using structural attention. In: Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp 1666–1674

  20. Ji Y, Zhang H, Jie Z, Ma L, Wu QJ (2020) Casnet: a cross-attention siamese network for video salient object detection. IEEE Trans Neural Netw Learn Syst. https://doi.org/10.1109/TNNLS.2020.3007534


  21. Xu K, Hu W, Leskovec J, Jegelka S (2018a) How powerful are graph neural networks? arXiv preprint arXiv:181000826

  22. Weisfeiler B, Leman A (1968) The reduction of a graph to canonical form and the algebra which appears therein. NTI, Series 2(9):12–16


  23. Yang B, Yih Wt, He X, Gao J, Deng L (2014) Embedding entities and relations for learning and inference in knowledge bases. arXiv preprint arXiv:14126575

  24. Zhang Z, Cai J, Zhang Y, Wang J (2020) Learning hierarchy-aware knowledge graph embeddings for link prediction. Proceed AAAI Conf Artif Intell 34:3065–3072


  25. Nickel M, Tresp V, Kriegel HP (2011) A three-way model for collective learning on multi-relational data. In: Icml

  26. Zhang Z, Cai J, Wang J (2020a) Duality-induced regularizer for tensor factorization based knowledge graph completion. arXiv preprint arXiv:201105816

  27. Sabour S, Frosst N, Hinton GE (2017) Dynamic routing between capsules. arXiv preprint arXiv:171009829

  28. Liang Y, Cai Z, Yu J, Han Q, Li Y (2018) Deep learning based inference of private information using embedded sensors in smart devices. IEEE Netw 32(4):8–14. https://doi.org/10.1109/MNET.2018.1700349


  29. Li K, Lu G, Luo G, Cai Z (2020) Seed-free graph de-anonymization with adversarial learning. In: Proceedings of the 29th ACM International Conference on Information & Knowledge Management, pp 745–754, https://doi.org/10.1145/3340531.3411970

  30. Kipf TN, Welling M (2016) Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:160902907

  31. Nathani D, Chauhan J, Sharma C, Kaul M (2019) Learning attention-based embeddings for relation prediction in knowledge graphs. arXiv preprint arXiv:190601195 https://doi.org/10.18653/v1/P19-1466

  32. Xu X, Feng W, Jiang Y, Xie X, Sun Z, Deng ZH (2019) Dynamically pruned message passing networks for large-scale knowledge graph reasoning. arXiv preprint arXiv:190911334

  33. Bansal T, Juan DC, Ravi S, McCallum A (2019) A2n: Attending to neighbors for knowledge graph inference. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pp 4387–4392, https://doi.org/10.18653/v1/P19-14316

  34. Hornik K, Stinchcombe M, White H (1989) Multilayer feedforward networks are universal approximators. Neural Netw 2(5):359–366. https://doi.org/10.1016/0893-6080(89)90020-8


  35. Hornik K (1991) Approximation capabilities of multilayer feedforward networks. Neural Netw 4(2):251–257. https://doi.org/10.1016/0893-6080(91)90009-T


  36. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I (2017) Attention is all you need. arXiv preprint arXiv:170603762

  37. Kingma DP, Ba J (2014) Adam: A method for stochastic optimization. arXiv preprint arXiv:14126980

  38. Toutanova K, Chen D, Pantel P, Poon H, Choudhury P, Gamon M (2015) Representing text for joint embedding of text and knowledge bases. In: Proceedings of the 2015 conference on empirical methods in natural language processing, pp 1499–1509, https://doi.org/10.18653/v1/D15-1174

  39. Cai Z, Zheng X (2018) A private and efficient mechanism for data uploading in smart cyber-physical systems. IEEE Trans Netw Sci Eng 7(2):766–775. https://doi.org/10.1109/TNSE.2018.2830307


  40. Cheng S, Cai Z, Li J, Gao H (2017) Extracting kernel dataset from big sensory data in wireless sensor networks. IEEE Trans Knowl Data Eng 29(4):813–827. https://doi.org/10.1109/TKDE.2016.2645212


  41. Wu Y, Zhang X, Bian Y, Cai Z, Lian X, Liao X, Zhao F (2018) Second-order random walk-based proximity measures in graph analysis: formulations and algorithms. VLDB J 27(1):127–152. https://doi.org/10.1007/s00778-017-0490-5



Acknowledgements

This work was supported by the National Key R&D Program of China (No. 2018YFC0807500), by the National Natural Science Foundation of China (No. U19A2059), and by the Ministry of Science and Technology of Sichuan Province Program (Nos. 2018GZDZX0048, 20ZDYF0343).

Funding

This work was supported by the National Key R&D Program of China (No. 2018YFC0807500), by the National Natural Science Foundation of China (No. U19A2059), and by the Ministry of Science and Technology of Sichuan Province Program (Nos. 2018GZDZX0048, 20ZDYF0343).

Author information


Contributions

Xue Zhou and Bei Hui conceived and designed the study. Xue Zhou and Kexi Ji performed the experiments. Xue Zhou wrote the paper. Bei Hui and Lizong Zhang reviewed and edited the manuscript. All authors read and approved the manuscript.

Corresponding author

Correspondence to Lizong Zhang.

Ethics declarations

Conflicts of interest/Competing interests

The authors declare that they have no competing interests.

Code availability

All data, models, and code generated or used during the study appear in https://github.com/ooCher/SD-GAT.

Conflict of interest

The authors declare that they do not have any commercial or associative interest that represents a conflict of interest in connection with the work submitted.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhou, X., Hui, B., Zhang, L. et al. A structure distinguishable graph attention network for knowledge base completion. Neural Comput & Applic 33, 16005–16017 (2021). https://doi.org/10.1007/s00521-021-06221-1

