
REET: Joint Relation Extraction and Entity Typing via Multi-task Learning

  • Conference paper
  • First Online:
Natural Language Processing and Chinese Computing (NLPCC 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11838)

Abstract

Relation Extraction (RE) and Entity Typing (ET) are two important tasks in the field of natural language processing. Existing methods usually handle RE and ET separately. However, the two tasks are strongly related: entity types are informative for inferring the relations between entities, and relations in turn provide important clues for predicting entity types. Exploiting this relatedness has the potential to improve the performance of both tasks. In this paper, we propose a neural-network-based approach that jointly trains relation extraction and entity typing models within a multi-task learning (MTL) framework. For relation extraction, we adopt a piecewise Convolutional Neural Network as the sentence encoder. For entity typing, since a sentence may contain multiple entities, we design a couple-attention model based on a Bidirectional Long Short-Term Memory network to obtain entity-specific sentence representations. In our MTL framework, the two tasks share not only the low-level input embeddings but also the high-level task-specific semantic representations. Experimental results on benchmark datasets demonstrate that our approach effectively improves the performance of both relation extraction and entity typing.
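
The architecture described above can be made concrete with a small sketch. The following PyTorch snippet is an illustrative approximation only, not the authors' REET implementation: it shows a shared embedding layer feeding (a) a CNN head for relation extraction, with the paper's piecewise max-pooling simplified to global max-pooling, and (b) a BiLSTM head whose attention is restricted to an entity mention, standing in for the couple-attention mechanism. All module names, dimensions, label counts, and loss choices are assumptions.

    # Minimal multi-task sketch (illustrative only): shared word embeddings feed
    # (a) a CNN head for relation extraction and
    # (b) a BiLSTM + entity-attention head for entity typing.
    # All names, sizes, and the simplified pooling/attention are assumptions,
    # not the paper's exact REET implementation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class JointREET(nn.Module):
        def __init__(self, vocab_size=10000, emb_dim=100, hidden=128,
                     n_relations=53, n_types=38):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)       # shared low-level input
            # RE head: 1-D CNN over the sentence (the paper's piecewise
            # max-pooling is simplified here to global max-pooling).
            self.conv = nn.Conv1d(emb_dim, hidden, kernel_size=3, padding=1)
            self.re_out = nn.Linear(hidden, n_relations)
            # ET head: BiLSTM encoder plus attention centred on the entity mention.
            self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                                  bidirectional=True)
            self.att = nn.Linear(2 * hidden, 1)
            self.et_out = nn.Linear(2 * hidden, n_types)

        def forward(self, tokens, entity_mask):
            # tokens: (B, L) word ids; entity_mask: (B, L), 1.0 at mention positions
            x = self.embed(tokens)                               # (B, L, E)
            # --- relation extraction branch ---
            c = F.relu(self.conv(x.transpose(1, 2)))             # (B, H, L)
            re_logits = self.re_out(c.max(dim=2).values)         # (B, n_relations)
            # --- entity typing branch ---
            h, _ = self.bilstm(x)                                # (B, L, 2H)
            scores = self.att(h).squeeze(-1)                     # (B, L)
            scores = scores.masked_fill(entity_mask == 0, -1e9)  # attend only at the entity
            alpha = torch.softmax(scores, dim=-1).unsqueeze(-1)  # (B, L, 1)
            et_logits = self.et_out((alpha * h).sum(dim=1))      # (B, n_types)
            return re_logits, et_logits

    # Joint training simply sums the two task losses over a shared batch.
    if __name__ == "__main__":
        model = JointREET()
        tokens = torch.randint(0, 10000, (4, 20))
        mask = torch.zeros(4, 20)
        mask[:, 5] = 1.0                                         # toy entity position
        re_logits, et_logits = model(tokens, mask)
        loss = F.cross_entropy(re_logits, torch.randint(0, 53, (4,))) \
             + F.binary_cross_entropy_with_logits(
                   et_logits, torch.randint(0, 2, (4, 38)).float())
        loss.backward()

The key multi-task ingredient is that both losses backpropagate through the same embedding layer, so supervision from either task shapes the shared representation.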


Notes

  1. https://ai.googleblog.com/2013/04/50000-lessons-on-how-to-read-relation.html.

  2. On the GDS dataset we compare only with recent baselines, since the dataset was released in 2018.


Acknowledgement

This work was supported by the National Key R&D Program of China (2018YFC0831005), the Science and Technology Key R&D Program of Tianjin (18YFZCSF01370) and the National Social Science Fund of China (15BTQ056).

Author information


Corresponding author

Correspondence to Wenjun Wang.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Liu, H. et al. (2019). REET: Joint Relation Extraction and Entity Typing via Multi-task Learning. In: Tang, J., Kan, M.Y., Zhao, D., Li, S., Zan, H. (eds) Natural Language Processing and Chinese Computing. NLPCC 2019. Lecture Notes in Computer Science (LNAI), vol. 11838. Springer, Cham. https://doi.org/10.1007/978-3-030-32233-5_26


  • DOI: https://doi.org/10.1007/978-3-030-32233-5_26

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32232-8

  • Online ISBN: 978-3-030-32233-5

  • eBook Packages: Computer Science (R0)
