Cross-Relational Reasoning for Neural Tensor Networks

  • Conference paper
Artificial Intelligence Applications and Innovations (AIAI 2024)

Part of the book series: IFIP Advances in Information and Communication Technology (IFIPAICT, volume 713)


Abstract

Neural tensor networks are knowledge graph embedding models which infer relationships between two given entities. Although demonstrated to be effective, any inference about an individual relation from a neural tensor network is isolated from the model's knowledge of the other relations in the problem domain. We introduce cross-relational reasoning, a novel inference mechanism for neural tensor networks which intelligently coordinates all of the model's relation-specific outputs to augment a prediction corresponding to a single relation. We frame the coordination of the relation-specific outputs as a meta-learning problem, not unlike stacked ensemble learning, and show that cross-relational reasoning consistently outperforms the original inference mechanism on the WN18RR knowledge graph. We also explore modifications to the neural tensor network's internal activation function, showing that ReLU or ELU can markedly accelerate the neural tensor network's convergence at the cost of long-term improvement during training, and that sigmoid universally improves the model's performance in the experimental setting of this paper.
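The following is a minimal sketch, not the authors' implementation, of the two ideas described in the abstract: a neural tensor network that scores a (head, relation, tail) triple with a per-relation bilinear tensor layer and a swappable internal activation, and a stacked-ensemble style meta-learner that takes the model's scores for every relation as features when predicting a single target relation. All class and parameter names (NeuralTensorNetwork, CrossRelationalReasoner, embed_dim, slices) are illustrative assumptions, not names from the paper.

```python
# Minimal sketch (not the authors' code) of a neural tensor network with a
# configurable activation and a stacked-ensemble "cross-relational" meta-learner.
import torch
import torch.nn as nn


class NeuralTensorNetwork(nn.Module):
    """Scores (head, relation, tail) triples in the style of Socher et al. (2013)."""

    def __init__(self, num_entities, num_relations, embed_dim=100, slices=4,
                 activation=torch.tanh):
        super().__init__()
        self.entities = nn.Embedding(num_entities, embed_dim)
        # One bilinear tensor W_r, linear map V_r, bias b_r and scoring vector u_r per relation.
        self.W = nn.Parameter(torch.randn(num_relations, slices, embed_dim, embed_dim) * 0.01)
        self.V = nn.Parameter(torch.randn(num_relations, slices, 2 * embed_dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(num_relations, slices))
        self.u = nn.Parameter(torch.randn(num_relations, slices) * 0.01)
        # Swappable internal activation: torch.tanh, torch.relu,
        # torch.nn.functional.elu or torch.sigmoid.
        self.activation = activation

    def score(self, h_idx, r_idx, t_idx):
        h = self.entities(h_idx)                       # (batch, d)
        t = self.entities(t_idx)                       # (batch, d)
        W, V = self.W[r_idx], self.V[r_idx]            # (batch, k, d, d), (batch, k, 2d)
        b, u = self.b[r_idx], self.u[r_idx]            # (batch, k)
        # Bilinear term h^T W_r^{[1:k]} t for every slice, plus the standard linear term.
        bilinear = torch.einsum('bd,bkde,be->bk', h, W, t)
        linear = torch.einsum('bkd,bd->bk', V, torch.cat([h, t], dim=-1))
        hidden = self.activation(bilinear + linear + b)
        return (u * hidden).sum(dim=-1)                # (batch,) raw score per triple

    def all_relation_scores(self, h_idx, t_idx):
        """Score an (h, t) pair against every relation; used as meta-features."""
        num_r = self.W.shape[0]
        r_idx = torch.arange(num_r, device=h_idx.device)
        h_rep = h_idx.unsqueeze(1).expand(-1, num_r).reshape(-1)
        t_rep = t_idx.unsqueeze(1).expand(-1, num_r).reshape(-1)
        r_rep = r_idx.unsqueeze(0).expand(h_idx.shape[0], -1).reshape(-1)
        return self.score(h_rep, r_rep, t_rep).view(-1, num_r)  # (batch, num_relations)


class CrossRelationalReasoner(nn.Module):
    """Stacked-ensemble style meta-learner: combines all relation-specific
    scores into one prediction for a single target relation."""

    def __init__(self, num_relations):
        super().__init__()
        self.meta = nn.Linear(num_relations, 1)

    def forward(self, relation_scores):
        # relation_scores: (batch, num_relations) output of all_relation_scores.
        return torch.sigmoid(self.meta(relation_scores)).squeeze(-1)
```

In a stacked-ensemble setting, a meta-learner of this kind would typically be fit on held-out scores after the base neural tensor network has been trained; this is only one plausible reading of the abstract's description, offered for orientation.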

Notes

  1. Note that the terminology used in this paper differs slightly from that of the original research by Socher et al. [13]; for instance, we express the entities as h and t instead of \(e_1\) and \(e_2\), respectively, as the former notation is more common in modern knowledge graph embedding research [4, 14] (the scoring function is restated in this notation after these notes).

  2. As an example of the trade-off between specificity and variety in the threshold calculation, we found empirical evidence that significantly wider bounds actually under-performed relative to the interval \([-6, 6]\).
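For reference, the neural tensor network scoring function of Socher et al. [13], restated with the h/t entity notation used in this paper; here \(f\) is the internal activation (tanh in the original, with ReLU, ELU and sigmoid explored in this paper), and \(T_r\) is a relation-specific decision threshold, which is, on our reading, what the interval in Note 2 refers to.

\[
g(h, r, t) = u_r^{\top} \, f\!\left( h^{\top} W_r^{[1:k]} t + V_r \begin{bmatrix} h \\ t \end{bmatrix} + b_r \right),
\qquad
(h, r, t) \text{ predicted true} \iff g(h, r, t) > T_r
\]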

References

  1. Bordes, A., Usunier, N., Garcia-Duran, A., Weston, J., Yakhnenko, O.: Translating embeddings for modeling multi-relational data. In: Advances in Neural Information Processing Systems, vol. 26 (2013)

  2. Buitinck, L., et al.: API design for machine learning software: experiences from the scikit-learn project. In: ECML PKDD Workshop: Languages for Data Mining and Machine Learning, pp. 108–122 (2013)

  3. Falck, T., Coulter, D.: Training neural tensor networks with corrupted relations. In: Rutkowski, L., Scherer, R., Korytkowski, M., Pedrycz, W., Tadeusiewicz, R., Zurada, J.M. (eds.) ICAISC 2023. LNCS, vol. 14125, pp. 56–67. Springer, Cham (2023). https://doi.org/10.1007/978-3-031-42505-9_6

  4. Ji, S., Pan, S., Cambria, E., Marttinen, P., Philip, S.Y.: A survey on knowledge graphs: representation, acquisition, and applications. IEEE Trans. Neural Netw. Learn. Syst. 33(2), 494–514 (2021)

  5. Li, W., Zhu, L., Cambria, E.: Taylor's theorem: a new perspective for neural tensor networks. Knowl.-Based Syst. 228, 107258 (2021)

  6. Miller, G.A.: WordNet: a lexical database for English. Commun. ACM 38(11), 39–41 (1995)

  7. Nickel, M., Tresp, V., Kriegel, H.P., et al.: A three-way model for collective learning on multi-relational data. In: ICML, vol. 11, pp. 3104482–3104584 (2011)

  8. Nielsen, M.A.: Neural Networks and Deep Learning, vol. 25. Determination Press, San Francisco (2015)

  9. Paszke, A., et al.: PyTorch: an imperative style, high-performance deep learning library. In: Advances in Neural Information Processing Systems, vol. 32, pp. 8024–8035. Curran Associates, Inc. (2019). http://papers.neurips.cc/paper/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf

  10. Pennington, J., Socher, R., Manning, C.D.: GloVe: global vectors for word representation. In: Empirical Methods in Natural Language Processing (EMNLP), pp. 1532–1543 (2014). http://www.aclweb.org/anthology/D14-1162

  11. Qiu, X., Huang, X.: Convolutional neural tensor network architecture for community-based question answering. In: Twenty-Fourth International Joint Conference on Artificial Intelligence (2015)

  12. Russell, S.J., Norvig, P.: Artificial Intelligence: A Modern Approach, 4th edn. Pearson Education, Inc., London (2020)

  13. Socher, R., Chen, D., Manning, C.D., Ng, A.: Reasoning with neural tensor networks for knowledge base completion. In: Advances in Neural Information Processing Systems, vol. 26 (2013)

  14. Wang, Q., Mao, Z., Wang, B., Guo, L.: Knowledge graph embedding: a survey of approaches and applications. IEEE Trans. Knowl. Data Eng. 29(12), 2724–2743 (2017)

  15. Wilson, H.: WN18RR (2020). https://doi.org/10.6084/m9.figshare.11911272.v1, https://figshare.com/articles/dataset/WN18RR/11911272


Author information

Corresponding author

Correspondence to Duncan Coulter.

Copyright information

© 2024 IFIP International Federation for Information Processing

About this paper

Cite this paper

Falck, T., Coulter, D. (2024). Cross-Relational Reasoning for Neural Tensor Networks. In: Maglogiannis, I., Iliadis, L., Macintyre, J., Avlonitis, M., Papaleonidas, A. (eds) Artificial Intelligence Applications and Innovations. AIAI 2024. IFIP Advances in Information and Communication Technology, vol 713. Springer, Cham. https://doi.org/10.1007/978-3-031-63219-8_4

  • DOI: https://doi.org/10.1007/978-3-031-63219-8_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-63218-1

  • Online ISBN: 978-3-031-63219-8

  • eBook Packages: Computer Science, Computer Science (R0)
