
Tree-Capsule: Tree-Structured Capsule Network for Improving Relation Extraction

Conference paper in: Advances in Knowledge Discovery and Data Mining (PAKDD 2021).

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 12714).

Abstract

Relation extraction benefits a variety of applications that require relational understanding of unstructured text, such as question answering. Recently, capsule network-based models have been proposed to improve relation extraction, offering better capacity for modeling complex entity relations. However, they fail to capture the syntactic structure of a sentence, which has proven useful for relation extraction. In this paper, we propose a Tree-structured Capsule network based model for improving sentence-level Relation Extraction (TCRE), which seamlessly incorporates syntax tree information (syntax trees generally include constituent trees and dependency trees; this work uses the constituent tree). In particular, we design a novel tree-structured capsule network (Tree-Capsule network) to encode the constituent tree. Additionally, we propose an entity-aware routing algorithm for the Tree-Capsule network that attends to the critical relevant information, further improving relation extraction for the target entities. Experimental results on standard datasets demonstrate that TCRE significantly improves relation extraction performance by incorporating syntactic structure information.
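The entity-aware routing algorithm itself is described in the full paper; as background, the standard dynamic routing between capsules (Sabour et al., 2017) that such models build on can be sketched as follows. This is a minimal NumPy sketch, assuming illustrative shapes and three routing iterations; it is not the paper's entity-aware variant.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash(x) = (||x|| * x) / (1 + ||x||^2); keeps direction, bounds norm below 1.
    norm = np.linalg.norm(s, axis=axis, keepdims=True)
    return (norm * s) / (1.0 + norm ** 2 + eps)

def dynamic_routing(u_hat, num_iters=3):
    """Route prediction vectors u_hat of shape (num_in, num_out, dim)
    to output capsules of shape (num_out, dim) by iterative agreement."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))                  # routing logits
    for _ in range(num_iters):
        # coupling coefficients: softmax of logits over output capsules
        e = np.exp(b - b.max(axis=1, keepdims=True))
        c = e / e.sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)       # weighted sum per output capsule
        v = squash(s)                                # (num_out, dim)
        b = b + (u_hat * v[None]).sum(axis=-1)       # raise logits where predictions agree
    return v
```

Entity-aware routing, as the abstract describes it, modifies how these coupling coefficients are computed so that information relevant to the target entity pair receives more routing weight.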


Notes

  1. \( \text{hardtanh}(x) = \min(\max(x, -1), 1) \).

  2. https://nlp.stanford.edu/software/lex-parser.html.

  3. \( \text{Squash}(\boldsymbol{x}) = \frac{\Vert \boldsymbol{x}\Vert \cdot \boldsymbol{x}}{1 + \Vert \boldsymbol{x}\Vert^2} \), where \(\boldsymbol{x}\) is a capsule vector.
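The two formulas in the footnotes can be checked numerically. A minimal NumPy sketch (function names mirror the footnotes; test values are arbitrary):

```python
import numpy as np

def hardtanh(x):
    # hardtanh(x) = min(max(x, -1), 1)  (footnote 1)
    return np.minimum(np.maximum(x, -1.0), 1.0)

def squash(x, eps=1e-8):
    # Squash(x) = (||x|| * x) / (1 + ||x||^2)  (footnote 3)
    # Preserves the direction of x and maps its norm n to n^2 / (1 + n^2),
    # which is always strictly below 1.
    norm = np.linalg.norm(x)
    return (norm * x) / (1.0 + norm ** 2 + eps)
```

Note that this form of Squash is algebraically equivalent to the usual capsule squash \( \frac{\Vert x\Vert^2}{1+\Vert x\Vert^2} \cdot \frac{x}{\Vert x\Vert} \): short vectors shrink toward zero, long vectors approach unit length.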


Acknowledgments

This work is supported in part by the National Natural Science Foundation of China (Nos. U20B2045, 61806020, 61772082) and the National Key Research and Development Program of China (No. 2018YFB1402600).

Author information

Correspondence to Chuan Shi.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Yang, T., et al. (2021). Tree-Capsule: Tree-Structured Capsule Network for Improving Relation Extraction. In: Karlapalem, K., et al. (eds.) Advances in Knowledge Discovery and Data Mining. PAKDD 2021. Lecture Notes in Computer Science, vol. 12714. Springer, Cham. https://doi.org/10.1007/978-3-030-75768-7_26

  • DOI: https://doi.org/10.1007/978-3-030-75768-7_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-75767-0

  • Online ISBN: 978-3-030-75768-7

  • eBook Packages: Computer Science (R0)
