
Relation Extraction Based on Dual Attention Mechanism

  • Conference paper
Data Science (ICPCSEE 2019)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1058)

Abstract

Traditional deep learning models for relation extraction have several shortcomings: they cannot learn long-distance dependency information, they do not consider the correlation between the model's input and output, and their processing of information across a set of sentences is insufficient. To address these problems, a relation extraction method combining a bidirectional GRU network with a multi-level attention mechanism is proposed. A word-level attention mechanism extracts word-level features from each sentence, and a sentence-level attention mechanism focuses on the features of the sentence set as a whole. Experiments on the NYT dataset show that the proposed method effectively improves the F1 score of relation extraction.
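The two attention levels in the abstract can be sketched as follows. This is a minimal numpy illustration only: the random matrices stand in for BiGRU hidden states, and the scoring functions, dimensions, and variable names are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(H, w):
    """Word-level attention: weight each token's hidden state.
    H: (seq_len, hidden) stand-in for BiGRU outputs of one sentence.
    w: (hidden,) scoring vector (would be learned; random here)."""
    scores = np.tanh(H) @ w            # one relevance score per token
    alpha = softmax(scores)            # attention weights over tokens
    return alpha @ H                   # (hidden,) sentence vector

def sentence_attention(S, r):
    """Sentence-level attention: weight each sentence in the bag
    by its relevance to a relation query vector r.
    S: (n_sent, hidden), r: (hidden,)."""
    beta = softmax(S @ r)              # attention weights over sentences
    return beta @ S                    # (hidden,) bag representation

rng = np.random.default_rng(0)
hidden = 8
H = rng.standard_normal((5, hidden))   # fake hidden states, 5 tokens
w = rng.standard_normal(hidden)
# three sentences in one bag, each reduced by word-level attention
sents = np.stack([word_attention(rng.standard_normal((5, hidden)), w)
                  for _ in range(3)])
bag = sentence_attention(sents, rng.standard_normal(hidden))
print(bag.shape)                       # (8,)
```

The bag vector would then be fed to a softmax classifier over relation labels; both attention layers produce weights that sum to 1, so each level is a convex combination of its inputs.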



Author information

Correspondence to Xue Li.


Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Li, X., Rao, Y., Sun, L., Lu, Y. (2019). Relation Extraction Based on Dual Attention Mechanism. In: Cheng, X., Jing, W., Song, X., Lu, Z. (eds) Data Science. ICPCSEE 2019. Communications in Computer and Information Science, vol 1058. Springer, Singapore. https://doi.org/10.1007/978-981-15-0118-0_27

  • DOI: https://doi.org/10.1007/978-981-15-0118-0_27

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-0117-3

  • Online ISBN: 978-981-15-0118-0

  • eBook Packages: Computer Science (R0)
