
Learning knowledge graph embedding with a bi-directional relation encoding network and a convolutional autoencoder decoding network

  • Original Article
Neural Computing and Applications

Abstract

Derived from knowledge bases, knowledge graphs represent knowledge in graph form, using nodes and edges to denote entities and relations, respectively. A knowledge graph can be described as a set of textual triples, each consisting of a head entity, a tail entity, and the relation between them. To represent the elements of a knowledge graph, knowledge graph embedding techniques map entities and relations into continuous vector spaces as numeric vectors for computational efficiency. Convolution-based knowledge graph embedding models have shown promising performance in knowledge graph representation learning. However, the inputs to these neural network-based models are frequently handcrafted, which can make the feature extraction procedure inefficient. In this paper, a convolutional autoencoder is proposed for knowledge graph representation learning that takes entity pairs as input and learns the corresponding hidden relation representation. In addition, a bi-directional relation encoding network is used as an encoder to represent the semantics of entities under different directional relation patterns, and its output initializes the convolutional autoencoder. Experiments are conducted as a link prediction task on the standard datasets WN18RR, Kinship, NELL-995, and FB15k-237. Furthermore, input embedding matrices composed of different ingredients are designed to evaluate the performance of the convolutional autoencoder. The results demonstrate that our model is effective in learning representations from entity feature interactions.
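The pipeline the abstract describes can be illustrated with a toy sketch: stack the embeddings of an entity pair into a small matrix, convolve it to obtain a hidden relation representation, and decode that representation back into a reconstruction of the pair. This is only a minimal numpy illustration of the general convolutional-autoencoder idea, not the paper's actual architecture; the dimensions, filter shapes, and linear decoder are hypothetical choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration; the paper's dimensions differ.
d = 8          # embedding dimension
n_filters = 4  # number of convolution filters
k = 3          # filter width along the embedding axis

# Embeddings of a (head, tail) entity pair, stacked into a 2 x d matrix.
head = rng.normal(size=d)
tail = rng.normal(size=d)
x = np.stack([head, tail])            # shape (2, d)

# Encoder weights: each filter spans both rows and k embedding positions.
W_enc = rng.normal(size=(n_filters, 2, k))

def conv_encode(x, W):
    """Valid convolution over the embedding axis, then ReLU.
    Returns feature maps of shape (n_filters, d - k + 1)."""
    n_f, _, kw = W.shape
    out = np.empty((n_f, x.shape[1] - kw + 1))
    for f in range(n_f):
        for i in range(out.shape[1]):
            out[f, i] = np.sum(W[f] * x[:, i:i + kw])
    return np.maximum(out, 0.0)

feat = conv_encode(x, W_enc)
hidden = feat.reshape(-1)             # hidden relation representation

# Decoder: a linear map reconstructing the 2 x d entity-pair matrix.
W_dec = rng.normal(size=(hidden.size, 2 * d)) * 0.1
x_rec = (hidden @ W_dec).reshape(2, d)

# During training, this reconstruction error would be minimized.
loss = np.mean((x - x_rec) ** 2)
```

In the paper's setting, the hidden code plays the role of a relation representation learned from entity feature interactions; here the decoder is a plain linear layer, whereas a full model would use transposed convolutions.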




Acknowledgements

This work is supported by the National Natural Science Foundation of China (No. 61772146, No. 61772211, No. U1811263) and the Natural Science Foundation of Guangdong (No. c20140500000225).

Author information


Corresponding author

Correspondence to Tianyong Hao.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Hu, K., Liu, H., Zhan, C. et al. Learning knowledge graph embedding with a bi-directional relation encoding network and a convolutional autoencoder decoding network. Neural Comput & Applic 33, 11157–11173 (2021). https://doi.org/10.1007/s00521-020-05654-4
