
A Hybrid Model for Spatio-Temporal Information Recognition in COVID-19 Trajectory Text

Conference paper
Web Information Systems and Applications (WISA 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13579)

Abstract

Since the outbreak of COVID-19 at the end of 2019, routine epidemic prevention and control has become one of the country's core tasks. Checking one's own movements against the published trajectories of diagnosed patients has become a basic, everyday part of epidemic prevention. Extracting the spatio-temporal information in COVID-19 patients' trajectory texts lets the public quickly check whether their own movements overlap with those of confirmed cases, which supports epidemic prevention work. This paper proposes a named entity recognition model that automatically identifies the time and place information in COVID-19 patient trajectory text. The model consists of an ALBERT layer, a Bi-GRU layer, and a GlobalPointer layer. The first two layers jointly extract contextual features and semantic dependencies, while the GlobalPointer layer extracts the corresponding named entities from a global perspective, improving recognition of long, nested place and time entities. Compared with conventional named entity recognition models, the proposed model is efficient, with a smaller parameter scale and faster training. We evaluate it on a dataset crawled from official COVID-19 trajectory texts; its F1-score reaches 92.86%, outperforming four traditional named entity recognition models.
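The key idea behind the GlobalPointer layer is to score every candidate span (start, end) of the token sequence directly, rather than tagging tokens one by one, which is what lets it handle long and nested entities. The sketch below illustrates that span-scoring idea in NumPy; it is a simplified illustration only (it omits the rotary position embeddings and the separate head per entity type that the full GlobalPointer uses, and the projection shapes and threshold are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def global_pointer_scores(h, wq, wk):
    """Score every candidate span (i, j) for one entity type.

    h:      (n, d) token representations (e.g. from ALBERT + Bi-GRU)
    wq, wk: (d, head_dim) projections for span starts / span ends
    Returns an (n, n) matrix where scores[i, j] is the score of the
    span running from token i to token j; spans with j < i are invalid
    and are masked to -inf.
    """
    q = h @ wq            # (n, head_dim) start representations
    k = h @ wk            # (n, head_dim) end representations
    scores = q @ k.T      # (n, n) pairwise start-end scores
    # Only spans with start <= end are valid: keep the upper triangle.
    mask = np.triu(np.ones_like(scores, dtype=bool))
    return np.where(mask, scores, -np.inf)

def decode_spans(scores, threshold=0.0):
    """Return all (start, end) spans scoring above the threshold."""
    return [(int(i), int(j)) for i, j in zip(*np.where(scores > threshold))]
```

Because decoding is a thresholded read-off over the full score matrix, two entities can overlap or nest freely, e.g. a district name inside a longer address span.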



Acknowledgements

This research is supported by the National Natural Science Foundation of China (No. U1936206). We thank the reviewers for their constructive comments.

Author information

Corresponding author: Yanlong Wen.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Yu, H., Pan, X., Zhao, D., Wen, Y., Yuan, X. (2022). A Hybrid Model for Spatio-Temporal Information Recognition in COVID-19 Trajectory Text. In: Zhao, X., Yang, S., Wang, X., Li, J. (eds) Web Information Systems and Applications. WISA 2022. Lecture Notes in Computer Science, vol 13579. Springer, Cham. https://doi.org/10.1007/978-3-031-20309-1_23


  • DOI: https://doi.org/10.1007/978-3-031-20309-1_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-20308-4

  • Online ISBN: 978-3-031-20309-1

  • eBook Packages: Computer Science (R0)
