Abstract
Fact verification is a challenging task that requires retrieving evidence from a corpus and verifying claims against it. This paper proposes Co-attention Networks with Graph Transformer (CNGT), a novel end-to-end reasoning framework for fact verification. Given a claim and retrieved evidence, CNGT constructs an evidence graph, applies a graph transformer to capture semantic interactions between the claim and the evidence, and learns global node representations of the evidence graph via self-attention mechanisms and block networks. Deep co-attention networks then integrate and reason over the claim and evidence simultaneously. Experiments on FEVER, a public large-scale benchmark dataset, show that CNGT achieves a FEVER score of 72.84% and a label accuracy of 76.93%, outperforming state-of-the-art baselines. CNGT exhibits de-noising and integrated reasoning abilities, and case studies show that it can explain its reasoning at the evidence level.
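As a minimal sketch of the claim–evidence co-attention described above, the following PyTorch snippet computes an affinity matrix between claim token representations and evidence-graph node representations and attends in both directions. The class name `CoAttention`, the bilinear affinity form, and the hidden size of 768 are illustrative assumptions, not the authors' implementation (which additionally involves the graph transformer and block networks).

```python
# Hypothetical sketch of bidirectional claim-evidence co-attention.
# Not the authors' code; dimensions and module names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CoAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # Bilinear weights for the claim-evidence affinity matrix
        self.w = nn.Linear(dim, dim, bias=False)

    def forward(self, claim: torch.Tensor, evidence: torch.Tensor):
        # claim: (m, d) token states; evidence: (n, d) graph node states
        affinity = claim @ self.w(evidence).T              # (m, n) affinity scores
        c2e = F.softmax(affinity, dim=-1) @ evidence       # claim attends to evidence, (m, d)
        e2c = F.softmax(affinity.T, dim=-1) @ claim        # evidence attends to claim, (n, d)
        return c2e, e2c


# Usage with random tensors standing in for encoder outputs
co_att = CoAttention(dim=768)
claim_states = torch.randn(12, 768)     # 12 claim tokens
evidence_states = torch.randn(40, 768)  # 40 evidence-graph nodes
claim_ctx, evidence_ctx = co_att(claim_states, evidence_states)
print(claim_ctx.shape, evidence_ctx.shape)  # torch.Size([12, 768]) torch.Size([40, 768])
```

In a full model, the two attended representations would be fused with the original states (e.g., by concatenation) before label prediction; this sketch only illustrates the bidirectional attention step.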
Acknowledgement
This work is partially supported by the NSFC-General Technology Joint Fund for Basic Research (No. U1936206) and the National Natural Science Foundation of China (No. 62172237, 62077031).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Yuan, J., Chen, C., Hou, C., Yuan, X. (2023). CNGT: Co-attention Networks with Graph Transformer for Fact Verification. In: Yang, X., et al. (eds.) Advanced Data Mining and Applications (ADMA 2023). Lecture Notes in Computer Science, vol. 14177. Springer, Cham. https://doi.org/10.1007/978-3-031-46664-9_39
DOI: https://doi.org/10.1007/978-3-031-46664-9_39
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-46663-2
Online ISBN: 978-3-031-46664-9
eBook Packages: Computer Science, Computer Science (R0)