
A Subject-aware Attention Hierarchical Tagger for Joint Entity and Relation Extraction

  • Conference paper
Advanced Information Systems Engineering (CAiSE 2022)

Abstract

Joint entity and relation extraction aims to detect entities and relations from unstructured text with a single model. The task is challenging because relational triples can overlap and because the internal interactions among the elements of a triple are easily lost. In this paper, we propose a Subject-aware Attention Hierarchical Tagger (SAHT) to overcome these challenges. First, the model identifies all subjects with a subject tagger. Second, a subject-aware attention mechanism that incorporates the subject features constructs a sentence representation specific to each subject. Finally, an object multi-relation tagger extracts the corresponding objects and relations from this representation, treating the step as a multi-label tagging task. Through this hierarchical extraction, SAHT makes full use of the internal characteristics of subjects to tie them closely to their corresponding objects and relations. Experiments on two public datasets demonstrate that SAHT achieves significant improvements in extracting overlapping relational triples compared with previous joint extraction models.
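To make the hierarchical pipeline concrete, below is a minimal sketch in PyTorch of the three steps the abstract describes: a subject tagger, subject-aware attention that fuses subject features into the sentence representation, and a multi-label object/relation tagger. All module choices, dimensions, and the way the subject vector is injected into the attention query are illustrative assumptions, not the authors' released implementation.

```python
# Illustrative sketch of the SAHT pipeline described in the abstract:
# subject tagger -> subject-aware attention -> object multi-relation tagger.
# Encoder choice, dimensions, and wiring are assumptions for illustration only.
import torch
import torch.nn as nn


class SAHTSketch(nn.Module):
    def __init__(self, vocab_size: int, hidden: int = 128, num_relations: int = 24):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Sentence encoder (a BiLSTM is used here purely as a placeholder).
        self.encoder = nn.LSTM(hidden, hidden // 2, bidirectional=True, batch_first=True)
        # Subject tagger: per-token start/end probabilities for subject spans.
        self.subj_start = nn.Linear(hidden, 1)
        self.subj_end = nn.Linear(hidden, 1)
        # Subject-aware attention: token encodings attended with the subject
        # representation folded into the query.
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        # Object multi-relation tagger: per-token start/end scores per relation,
        # treated as a multi-label problem.
        self.obj_start = nn.Linear(hidden, num_relations)
        self.obj_end = nn.Linear(hidden, num_relations)

    def forward(self, token_ids, subj_span):
        h, _ = self.encoder(self.embed(token_ids))                     # (B, T, H)
        # Step 1: tag candidate subject spans over every token.
        subj_start_p = torch.sigmoid(self.subj_start(h)).squeeze(-1)   # (B, T)
        subj_end_p = torch.sigmoid(self.subj_end(h)).squeeze(-1)       # (B, T)
        # Step 2: build a subject-specific sentence representation
        # (mean-pooled subject span added to the attention query).
        s, e = subj_span
        subj_vec = h[:, s:e + 1].mean(dim=1, keepdim=True)             # (B, 1, H)
        ctx, _ = self.attn(query=h + subj_vec, key=h, value=h)         # (B, T, H)
        # Step 3: multi-label object/relation tagging on the fused representation.
        obj_start_p = torch.sigmoid(self.obj_start(ctx))               # (B, T, R)
        obj_end_p = torch.sigmoid(self.obj_end(ctx))                   # (B, T, R)
        return subj_start_p, subj_end_p, obj_start_p, obj_end_p


if __name__ == "__main__":
    model = SAHTSketch(vocab_size=1000)
    tokens = torch.randint(0, 1000, (2, 16))        # toy batch of 2 sentences
    outs = model(tokens, subj_span=(3, 5))          # pretend tokens 3-5 form a subject
    print([o.shape for o in outs])
```

In this reading, each subject detected in step 1 yields its own pass through steps 2 and 3, so objects and relations are always extracted with respect to a specific subject; this is one way overlapping triples sharing a subject can be handled.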


Notes

  1. The filtered dataset can be downloaded at: https://github.com/xiangrongzeng/copy_re.


Acknowledgment

This work was supported in part by the National Key Research and Development Program of China under Grant 2020YFC1807104.

Author information


Corresponding author

Correspondence to Yawei Zhao.



Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhao, Y., Li, X. (2022). A Subject-aware Attention Hierarchical Tagger for Joint Entity and Relation Extraction. In: Franch, X., Poels, G., Gailly, F., Snoeck, M. (eds) Advanced Information Systems Engineering. CAiSE 2022. Lecture Notes in Computer Science, vol 13295. Springer, Cham. https://doi.org/10.1007/978-3-031-07472-1_16


  • DOI: https://doi.org/10.1007/978-3-031-07472-1_16


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-07471-4

  • Online ISBN: 978-3-031-07472-1

  • eBook Packages: Computer Science, Computer Science (R0)
