Augmenting Context Representation with Triggers Knowledge for Relation Extraction

  • Conference paper
Intelligent Information Processing XI (IIP 2022)

Part of the book series: IFIP Advances in Information and Communication Technology (IFIPAICT, volume 643)


Abstract

Relation Extraction (RE) requires a model to classify the correct relation from a set of candidate relations, given a sentence and two entities within it. Recent work mainly studies how to utilize more data or incorporate extra context information, especially with Pre-trained Language Models (PLMs). However, such models still struggle to avoid being misled by irrelevant or misleading words. In this paper, we propose a novel model to alleviate this deficiency. Specifically, our model iteratively mines the triggers of a sentence, using the sentence representation from the previous iteration, and augments the semantics of the context representation from BERT with both the entity pair and the triggers. We conduct extensive experiments to evaluate the proposed model and obtain empirical improvements on TACRED.
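The fusion step described in the abstract can be illustrated with a minimal sketch. All names, dimensions, and the concatenation-plus-softmax fusion below are illustrative assumptions, not the paper's actual architecture: the toy vectors stand in for BERT hidden states, and mean-pooled trigger vectors are one plausible way to fold mined triggers into the context representation.

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 8          # toy hidden size (BERT-base uses 768)
NUM_RELATIONS = 3   # toy relation candidate set

def augment_context(cls_vec, head_vec, tail_vec, trigger_vecs):
    """Fuse the sentence-level [CLS] vector with entity-pair and
    trigger representations by concatenation (one plausible fusion)."""
    trigger_pool = trigger_vecs.mean(axis=0)  # average-pool mined trigger tokens
    return np.concatenate([cls_vec, head_vec, tail_vec, trigger_pool])

def classify(fused, W, b):
    """Linear relation classifier with a softmax over relation candidates."""
    logits = W @ fused + b
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()

# Toy encoder outputs standing in for BERT hidden states.
cls_vec = rng.standard_normal(HIDDEN)
head_vec = rng.standard_normal(HIDDEN)      # head entity representation
tail_vec = rng.standard_normal(HIDDEN)      # tail entity representation
trigger_vecs = rng.standard_normal((2, HIDDEN))  # two mined trigger tokens

W = rng.standard_normal((NUM_RELATIONS, 4 * HIDDEN))
b = np.zeros(NUM_RELATIONS)

fused = augment_context(cls_vec, head_vec, tail_vec, trigger_vecs)
probs = classify(fused, W, b)
print(probs.shape)  # one probability per relation candidate
```

In an iterative setup, the predicted distribution (or the re-encoded sentence) from one pass would guide which tokens are treated as triggers in the next pass.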



Acknowledgement

The authors wish to thank the reviewers for their helpful comments and suggestions. This work was supported by the National Key Research and Development Program (Grant No. 2018YFC0831700) and the National Natural Science Foundation of China (Grant Nos. 61671064 and 61732005).

Author information

Corresponding author: Shumin Shi.


Copyright information

© 2022 IFIP International Federation for Information Processing

About this paper

Cite this paper

Li, E., Shi, S., Yang, Z., Huang, H.Y. (2022). Augmenting Context Representation with Triggers Knowledge for Relation Extraction. In: Shi, Z., Zucker, J.D., An, B. (eds) Intelligent Information Processing XI. IIP 2022. IFIP Advances in Information and Communication Technology, vol 643. Springer, Cham. https://doi.org/10.1007/978-3-031-03948-5_11

  • DOI: https://doi.org/10.1007/978-3-031-03948-5_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-03947-8

  • Online ISBN: 978-3-031-03948-5

  • eBook Packages: Computer Science (R0)
