CNGT: Co-attention Networks with Graph Transformer for Fact Verification

  • Conference paper
Advanced Data Mining and Applications (ADMA 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14177)


Abstract

Fact verification is a challenging task that requires retrieving evidence from a corpus and verifying claims against it. This paper proposes Co-attention Networks with Graph Transformer (CNGT), a novel end-to-end reasoning framework for fact verification. Given a claim and its retrieved evidence, CNGT constructs an evidence graph, uses a graph transformer to capture semantic interactions between the claim and the evidence, and learns global node representations of the evidence graph via self-attention mechanisms and block networks. Deep co-attention networks then integrate and reason over the evidence and the claim simultaneously. Experiments on FEVER, a large-scale public benchmark dataset, show that CNGT achieves a 72.84% FEVER score and 76.93% label accuracy, outperforming state-of-the-art baselines. Case studies show that CNGT has de-noising and integrated reasoning abilities and can explain its reasoning at the evidence level.
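The co-attention step the abstract describes can be illustrated with a small sketch: bidirectional attention in which each claim token attends over evidence tokens and vice versa, so the two representations are integrated simultaneously. This is a simplified illustration under assumed shapes and function names, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(claim, evidence):
    """claim: (m, d) token embeddings; evidence: (n, d) token embeddings.
    Returns claim-attended evidence (m, d) and evidence-attended claim (n, d)."""
    affinity = claim @ evidence.T                 # (m, n) pairwise similarity scores
    c2e = softmax(affinity, axis=1)               # each claim token attends over evidence
    e2c = softmax(affinity.T, axis=1)             # each evidence token attends over claim
    attended_evidence = c2e @ evidence            # (m, d) evidence summary per claim token
    attended_claim = e2c @ claim                  # (n, d) claim summary per evidence token
    return attended_evidence, attended_claim

rng = np.random.default_rng(0)
claim = rng.normal(size=(4, 8))      # hypothetical encoded claim, 4 tokens
evidence = rng.normal(size=(6, 8))   # hypothetical encoded evidence, 6 tokens
att_ev, att_cl = co_attention(claim, evidence)
print(att_ev.shape, att_cl.shape)    # (4, 8) (6, 8)
```

In the full model these attended representations would feed subsequent reasoning layers; the sketch only shows why co-attention lets claim and evidence condition on each other in a single pass.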



Acknowledgement

This work is partially supported by the NSFC-General Technology Joint Fund for Basic Research (No. U1936206) and the National Natural Science Foundation of China (Nos. 62172237 and 62077031).

Author information

Corresponding author

Correspondence to Chen Chen.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Yuan, J., Chen, C., Hou, C., Yuan, X. (2023). CNGT: Co-attention Networks with Graph Transformer for Fact Verification. In: Yang, X., et al. (eds.) Advanced Data Mining and Applications. ADMA 2023. Lecture Notes in Computer Science, vol 14177. Springer, Cham. https://doi.org/10.1007/978-3-031-46664-9_39

  • DOI: https://doi.org/10.1007/978-3-031-46664-9_39

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46663-2

  • Online ISBN: 978-3-031-46664-9

  • eBook Packages: Computer Science (R0)
