Research article · DOI: 10.1145/3622896.3622905

RSAA: Relation-Specific Attention and Global Alignment Based Joint Entity and Relation Extraction

Published: 03 October 2023

Abstract

Joint relational triple extraction is a crucial task in knowledge graph construction. However, most existing methods either perform many redundant predictions over relations, which slows inference, or ignore the semantic information of relations when extracting subjects and objects. To overcome these deficiencies, we propose RSAA, a model for joint extraction of entities and relations. Concretely, RSAA consists of a Relation-Based Attention module, a Relation-Specific Entity Extraction module, and a Subject-Object Alignment module. We first detect the relations present in the text and inject relational features into the sentence representation through an attention mechanism. The Relation-Specific Entity Extraction module then extracts subjects and objects only for the detected subset of relations rather than for all relations. Finally, the Subject-Object Alignment module aligns each subject and object into a triple. Experiments on the NYT, NYT*, NYT10, WebNLG, and WebNLG* datasets demonstrate that the proposed model efficiently extracts overlapping triples, obtains strong generalization ability and faster inference speed, and outperforms other baselines.
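The three-stage pipeline the abstract describes (relation detection, relation-specific attention, per-relation entity extraction, subject-object alignment) can be sketched with toy NumPy tensors. Every name, dimension, and threshold below is an illustrative assumption, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy setup: a 6-token sentence, hidden size 8, 4 candidate relations.
# In the paper the token encodings would come from a pretrained encoder.
seq_len, hidden, num_rels = 6, 8, 4
H = rng.normal(size=(seq_len, hidden))          # token encodings
rel_emb = rng.normal(size=(num_rels, hidden))   # relation embeddings

# 1) Relation detection: score which relations appear in the sentence,
#    so later stages run only on this subset instead of all relations.
sent_vec = H.mean(axis=0)
rel_scores = sigmoid(rel_emb @ sent_vec)
detected = np.where(rel_scores > 0.5)[0]

# 2) Relation-specific attention: fuse a detected relation's embedding
#    into the token representations via per-token attention weights.
def fuse_relation(H, r):
    attn = sigmoid(H @ rel_emb[r])              # token relevance to relation r
    return H + attn[:, None] * rel_emb[r]       # relation-aware token features

# 3) Per-relation entity extraction: tag candidate subject/object tokens.
W_subj = rng.normal(size=hidden)
W_obj = rng.normal(size=hidden)

triples = []
for r in detected:
    Hr = fuse_relation(H, r)
    subj_tags = sigmoid(Hr @ W_subj) > 0.5
    obj_tags = sigmoid(Hr @ W_obj) > 0.5
    # 4) Subject-object alignment: a global correspondence matrix scores
    #    every (subject, object) token pair; high-scoring pairs form triples.
    M = sigmoid(Hr @ Hr.T)
    for s in np.where(subj_tags)[0]:
        for o in np.where(obj_tags)[0]:
            if s != o and M[s, o] > 0.5:
                triples.append((int(s), int(r), int(o)))

print(triples)  # (subject_idx, relation_id, object_idx) tuples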


Published In

CCRIS '23: Proceedings of the 2023 4th International Conference on Control, Robotics and Intelligent System, August 2023, 215 pages. ISBN: 9798400708190. DOI: 10.1145/3622896.

Publisher: Association for Computing Machinery, New York, NY, United States
