Chinese medical relation extraction based on multi-hop self-attention mechanism

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

The medical literature is a primary vehicle for presenting academic achievements and for academic exchange, and the vast body of published work has become an enormous repository of knowledge, so automatically extracting the implicit medical knowledge it contains is essential. Medical relation extraction aims to identify relations between medical entities in text automatically, supporting a wide range of medical research, yet little of this work has targeted Chinese medical literature. The currently popular methods are based on neural networks and capture the semantics of a sentence from only a single perspective. However, the relation between two entities is determined by complex semantic information that a single sentence vector cannot fully represent. In this paper, we propose an attention-based model that extracts multi-aspect semantic information for Chinese medical relation extraction through a multi-hop attention mechanism. At each attention hop, the model generates a separate weight vector over the sentence, so each hop yields a different semantic representation of the same sentence. We evaluate the model on Chinese medical literature from the China National Knowledge Infrastructure (CNKI), where it achieves an F1 score of 93.19% on the therapeutic relation task and 73.47% on the causal relation task.
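A minimal sketch of the multi-hop attention idea described in the abstract, written in PyTorch. The module name, hop count, and dimensions are illustrative assumptions rather than the authors' exact architecture: each hop produces its own softmax weight vector over the encoder's token states, giving one pooled sentence vector per hop.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHopAttention(nn.Module):
    """Illustrative multi-hop self-attention pooling (a sketch, not the paper's code).

    Each attention hop produces its own weight vector over the token
    representations, so the layer returns several sentence vectors that
    can capture different aspects of the sentence semantics.
    """

    def __init__(self, hidden_dim: int, num_hops: int = 3, attn_dim: int = 64):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, attn_dim, bias=False)   # project token states
        self.score = nn.Linear(attn_dim, num_hops, bias=False)    # one score column per hop

    def forward(self, H: torch.Tensor) -> torch.Tensor:
        # H: (batch, seq_len, hidden_dim), e.g. Bi-LSTM outputs
        scores = self.score(torch.tanh(self.proj(H)))             # (batch, seq_len, num_hops)
        A = F.softmax(scores, dim=1)                              # attention weights per hop
        M = torch.bmm(A.transpose(1, 2), H)                       # (batch, num_hops, hidden_dim)
        return M                                                  # one sentence vector per hop


# Example usage: pool encoder outputs and feed a simple relation classifier.
batch, seq_len, hidden, hops = 8, 40, 200, 3
H = torch.randn(batch, seq_len, hidden)                           # stand-in for Bi-LSTM outputs
M = MultiHopAttention(hidden_dim=hidden, num_hops=hops)(H)
logits = nn.Linear(hops * hidden, 2)(M.flatten(start_dim=1))      # e.g. relation vs. no relation
```

Concatenating the per-hop sentence vectors, as in the example, is one simple way to combine them; the paper's exact combination scheme may differ.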

Notes

  1. https://gate.ac.uk/.

Abbreviations

NLP: Natural language processing
CNKI: China National Knowledge Infrastructure
PPIs: Protein–protein interactions
DDIs: Drug–drug interactions
CPIs: Chemical–protein interactions
CDIs: Chemical–disease interactions
SAE: Stacked autoencoder
CNNs: Convolutional neural networks
RNN: Recurrent neural network
LSTM: Long short-term memory network
Bi-LSTM: Bidirectional long short-term memory network
SVM: Support vector machine
McDepCNN: Multi-channel dependency-based convolutional neural network

Funding

This work has been supported by the Natural Science Foundation of China (Nos. 61632011 and 61572102), the Postdoctoral Science Foundation of China (No. 2018M641691), and the Foundation of State Key Laboratory of Cognitive Intelligence, iFLYTEK, P.R. China (COGOS-20190001). The funding bodies did not play any role in the design of the study, data collection and analysis, or preparation of the manuscript.

Author information

Corresponding author

Correspondence to Hongfei Lin.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zhang, T., Lin, H., Tadesse, M.M. et al. Chinese medical relation extraction based on multi-hop self-attention mechanism. Int. J. Mach. Learn. & Cyber. 12, 355–363 (2021). https://doi.org/10.1007/s13042-020-01176-6
