
Knowledge-aware Textual Entailment with Graph Attention Network

Published: 03 November 2019

Abstract

Textual entailment is a central problem of language variability that has attracted considerable interest and poses significant challenges for natural language understanding systems. Various frameworks have been proposed for recognizing textual entailment, ranging from traditional computational-linguistics techniques to deep learning methods. However, recent deep neural networks that achieve state-of-the-art results on the textual entailment task consider only the context of the given sentences, ignoring real-world background information and knowledge beyond that context. In this paper, we propose a Knowledge-Context Interactive Textual Entailment Network (KCI-TEN) that learns graph-level sentence representations by harnessing an external knowledge graph with a graph attention network. We further propose a text-graph interaction mechanism for neural entailment matching that assigns less importance to redundancy and noise and emphasizes informative representations. Experiments on the SciTail dataset demonstrate that KCI-TEN outperforms state-of-the-art methods.
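The paper's KCI-TEN implementation is not reproduced on this page. Purely as an illustration of the graph attention mechanism the abstract builds on (a single-head GAT layer in the style of Velickovic et al. [ICLR 2018]), here is a minimal NumPy sketch; all function names, shapes, and the tanh output nonlinearity are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def gat_layer(H, A, W, a, alpha=0.2):
    """Single graph attention head (illustrative sketch, not KCI-TEN).

    H: (N, F) node features; A: (N, N) adjacency (nonzero = edge, with self-loops);
    W: (F, Fp) linear projection; a: (2*Fp,) attention vector.
    Returns updated node features of shape (N, Fp).
    """
    Z = H @ W                                    # project node features
    N = Z.shape[0]
    # e_ij = LeakyReLU(a^T [z_i || z_j]) for every ordered node pair
    e = np.array([[np.concatenate([Z[i], Z[j]]) @ a for j in range(N)]
                  for i in range(N)])
    e = np.where(e > 0, e, alpha * e)            # LeakyReLU
    e = np.where(A > 0, e, -1e9)                 # mask out non-neighbors
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att = att / att.sum(axis=1, keepdims=True)   # softmax over each node's neighbors
    return np.tanh(att @ Z)                      # attention-weighted aggregation
```

In KCI-TEN this kind of layer would run over the subgraph of an external knowledge graph (e.g. ConceptNet) matched to the premise and hypothesis, producing the graph-level sentence representations the abstract describes.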




Published In

CIKM '19: Proceedings of the 28th ACM International Conference on Information and Knowledge Management
November 2019
3373 pages
ISBN:9781450369763
DOI:10.1145/3357384


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. graph attention network
  2. knowledge base
  3. textual entailment

Qualifiers

  • Short-paper


Acceptance Rates

CIKM '19 Paper Acceptance Rate 202 of 1,031 submissions, 20%;
Overall Acceptance Rate 1,861 of 8,427 submissions, 22%



Cited By

  • (2024)Dynamic Demonstration Retrieval and Cognitive Understanding for Emotional Support ConversationProceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval10.1145/3626772.3657695(774-784)Online publication date: 10-Jul-2024
  • (2024)Deep Learning for Detecting Entailment Between Requirements Using Semantics from Use Case Diagrams as Training Data: A Comparative Study2024 International Seminar on Intelligent Technology and Its Applications (ISITIA)10.1109/ISITIA63062.2024.10668363(669-674)Online publication date: 10-Jul-2024
  • (2024)Path-Aware Cross-Attention Network for Question AnsweringAdvances in Knowledge Discovery and Data Mining10.1007/978-981-97-2253-2_9(105-117)Online publication date: 25-Apr-2024
  • (2020)Syntax-aware Natural Language Inference with Graph Matching Networks2020 International Conference on Technologies and Applications of Artificial Intelligence (TAAI)10.1109/TAAI51410.2020.00024(85-90)Online publication date: Dec-2020
