Enhancing Entity Linking with Contextualized Entity Embeddings

  • Conference paper
  • In: Natural Language Processing and Chinese Computing (NLPCC 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13552)

Abstract

Entity linking (EL) in written-language domains has been extensively studied, but EL of spoken language remains largely unexplored. We propose a conceptually simple and highly effective two-stage approach to this problem. The first stage retrieves candidates with a dual encoder, which encodes the mention context and the entity descriptions independently. Each candidate is then reranked by a LUKE-based cross-encoder, which concatenates the mention context with the entity description. Unlike previous cross-encoders, which take only words as input, our model adds entities to the input. Experiments demonstrate that our model does not need large-scale training on a Wikipedia corpus and outperforms all previous models, whether or not they were trained on Wikipedia. Our approach ranks \(1^\textrm{st}\) in the NLPCC 2022 Shared Task on Speech EL Track 2 (Entity Disambiguation-Only).
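
To make the two-stage pipeline concrete, here is a minimal sketch of its control flow. The encoders are random stand-ins for the actual models (a BERT-style dual encoder and the LUKE-based cross-encoder); the names encode_mention, encode_entity, and cross_encoder_score, the dimension D, and the candidate count K are illustrative assumptions, not the authors' code.

    import torch

    D = 768  # embedding dimension (assumption)
    K = 10   # candidates passed to the reranker, following note 4

    torch.manual_seed(0)

    def encode_mention(context: str) -> torch.Tensor:
        # Stand-in for the dual encoder's mention-context encoder.
        return torch.randn(D)

    def encode_entity(description: str) -> torch.Tensor:
        # Stand-in for the dual encoder's entity-description encoder.
        return torch.randn(D)

    def cross_encoder_score(context: str, description: str) -> float:
        # Stand-in for the LUKE-based cross-encoder, which jointly reads
        # the concatenated mention context and entity description.
        return torch.randn(1).item()

    def link(context: str, entity_descriptions: list[str]) -> int:
        # Stage 1: retrieve top-K candidates by inner product; mention
        # context and entity descriptions are encoded independently.
        q = encode_mention(context)
        E = torch.stack([encode_entity(d) for d in entity_descriptions])
        candidates = torch.topk(E @ q, k=min(K, len(entity_descriptions))).indices
        # Stage 2: rerank each candidate jointly with the mention context.
        return max(candidates.tolist(),
                   key=lambda i: cross_encoder_score(context, entity_descriptions[i]))

The property the sketch preserves is that stage one scores every entity independently of the mention (so entity embeddings can be indexed offline), while stage two attends jointly over the mention and each of the K candidates.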

Notes

  1. The Wikidata page for the entity “George W. Bush”: https://www.wikidata.org/wiki/Q207.

  2. We use KILT’s processed Wikipedia dump, available at http://dl.fbaipublicfiles.com/KILT/kilt_knowledgesource.json.

  3. The training and validation sets of Wikipedia hyperlinks can be found at https://github.com/facebookresearch/KILT.

  4. For the number of candidates: according to Wu et al. [26], \(K=10\) is optimal, and increasing \(K\) to 100 yields minimal improvement at \(10\times\) the reranking run-time. We therefore choose \(K=10\); see the retrieval sketch following these notes.

  5. BLINK checkpoints can be downloaded at https://github.com/facebookresearch/BLINK/blob/main/download_blink_models.sh.

  6. We use the alias table from the GENRE [7] repository: https://dl.fbaipublicfiles.com/GENRE/mention2wikidataID_with_titles_label_alias_redirect.pkl.

  7. https://radimrehurek.com/gensim_3.8.3/summarization/bm25.html.
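
As a rough illustration of the candidate-retrieval step discussed in note 4, the sketch below builds an exact inner-product index over entity embeddings with FAISS [14] and fetches the top \(K=10\) candidates for one mention. The vector dimension, corpus size, and random embeddings are placeholder assumptions, not values from the paper.

    import numpy as np
    import faiss  # similarity-search library of Johnson et al. [14]

    d, num_entities, k = 768, 10_000, 10   # placeholder sizes; K=10 per note 4
    rng = np.random.default_rng(0)

    # Precomputed entity-description embeddings (random stand-ins here).
    entity_vecs = rng.random((num_entities, d), dtype=np.float32)
    index = faiss.IndexFlatIP(d)           # exact maximum-inner-product search
    index.add(entity_vecs)

    # One mention-context embedding from the dual encoder (also a stand-in).
    query = rng.random((1, d), dtype=np.float32)
    scores, candidate_ids = index.search(query, k)
    print(candidate_ids[0])                # the K entity ids handed to the reranker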

References

  1. Barba, E., Procopio, L., Navigli, R.: ExtEnD: extractive entity disambiguation. In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Dublin, Ireland, pp. 2478–2488. Association for Computational Linguistics, May 2022

  2. Beltagy, I., Peters, M.E., Cohan, A.: Longformer: the long-document transformer. arXiv:2004.05150 (2020)

  3. Benton, A., Dredze, M.: Entity linking for spoken language. In: Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Denver, Colorado, pp. 225–230. Association for Computational Linguistics, May–June 2015

  4. Botha, J.A., Shan, Z., Gillick, D.: Entity linking in 100 languages. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 7833–7845. Association for Computational Linguistics, November 2020. https://aclanthology.org/2020.emnlp-main.630

  5. De Cao, N., Aziz, W., Titov, I.: Question answering by reasoning across documents with graph convolutional networks. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), Minneapolis, Minnesota, pp. 2306–2317. Association for Computational Linguistics, June 2019

  6. De Cao, N., Aziz, W., Titov, I.: Highly parallel autoregressive entity linking with discriminative correction. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, Punta Cana, Dominican Republic, pp. 7662–7669. Association for Computational Linguistics, November 2021. https://aclanthology.org/2021.emnlp-main.604

  7. De Cao, N., Izacard, G., Riedel, S., Petroni, F.: Autoregressive entity retrieval. In: International Conference on Learning Representations (2021). https://openreview.net/forum?id=5k8F6UU39V

  8. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), Minneapolis, Minnesota, pp. 4171–4186. Association for Computational Linguistics, June 2019. https://aclanthology.org/N19-1423

  9. Gillick, D., et al.: Learning dense representations for entity retrieval. In: Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), Hong Kong, China, pp. 528–537. Association for Computational Linguistics, November 2019. https://doi.org/10.18653/v1/K19-1049. https://aclanthology.org/K19-1049

  10. Hernandez, F., Nguyen, V., Ghannay, S., Tomashenko, N., Estève, Y.: TED-LIUM 3: twice as much data and corpus repartition for experiments on speaker adaptation. In: Karpov, A., Jokisch, O., Potapova, R. (eds.) SPECOM 2018. LNCS (LNAI), vol. 11096, pp. 198–208. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-99579-3_21

  11. Humeau, S., Shuster, K., Lachaux, M.A., Weston, J.: Poly-encoders: architectures and pre-training strategies for fast and accurate multi-sentence scoring. In: ICLR (2020)

  12. Ji, H., Grishman, R.: Knowledge base population: successful approaches and challenges. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, Portland, Oregon, USA, pp. 1148–1158. Association for Computational Linguistics, June 2011

  13. Ji, H., Nothman, J., Hachey, B., et al.: Overview of TAC-KBP2014 entity discovery and linking tasks. In: Proceedings of the Text Analysis Conference (TAC 2014), pp. 1333–1339 (2014)

  14. Johnson, J., Douze, M., Jégou, H.: Billion-scale similarity search with GPUs. IEEE Trans. Big Data 7(3), 535–547 (2019)

  15. Joko, H., Hasibi, F., Balog, K., de Vries, A.P.: Conversational entity linking: problem definition and datasets. In: Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2021. ACM (2021)

  16. Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. In: Bengio, Y., LeCun, Y. (eds.) 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7–9, 2015, Conference Track Proceedings (2015). http://arxiv.org/abs/1412.6980

  17. Lewis, M., et al.: BART: denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 7871–7880. Association for Computational Linguistics, July 2020. https://aclanthology.org/2020.acl-main.703

  18. Lin, Y., Shen, S., Liu, Z., Luan, H., Sun, M.: Neural relation extraction with selective attention over instances. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Berlin, Germany, pp. 2124–2133. Association for Computational Linguistics, August 2016

  19. Liu, X., Li, Y., Wu, H., Zhou, M., Wei, F., Lu, Y.: Entity linking for tweets. In: Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Sofia, Bulgaria, pp. 1304–1311. Association for Computational Linguistics, August 2013. https://aclanthology.org/P13-1128

  20. Liu, Y., et al.: RoBERTa: a robustly optimized BERT pretraining approach. arXiv:1907.11692 (2019)

  21. Logeswaran, L., Chang, M.W., Lee, K., Toutanova, K., Devlin, J., Lee, H.: Zero-shot entity linking by reading entity descriptions. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp. 3449–3460. Association for Computational Linguistics, July 2019

  22. Muralidharan, D., et al.: Noise robust named entity understanding for voice assistants. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers, pp. 196–204. Association for Computational Linguistics, June 2021. https://aclanthology.org/2021.naacl-industry.25

  23. Petroni, F., et al.: KILT: a benchmark for knowledge intensive language tasks. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 2523–2544. Association for Computational Linguistics, June 2021

  24. Sun, T., et al.: CoLAKE: contextualized language and knowledge embedding. In: Proceedings of the 28th International Conference on Computational Linguistics, Barcelona, Spain, pp. 3660–3670. International Committee on Computational Linguistics, December 2020. https://aclanthology.org/2020.coling-main.327

  25. Vaswani, A., et al.: Attention is all you need. In: Guyon, I., et al. (eds.) Advances in Neural Information Processing Systems, vol. 30. Curran Associates, Inc. (2017)

  26. Wu, L., Petroni, F., Josifoski, M., Riedel, S., Zettlemoyer, L.: Scalable zero-shot entity linking with dense entity retrieval. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 6397–6407. Association for Computational Linguistics, November 2020

  27. Yamada, I., Asai, A., Shindo, H., Takeda, H., Matsumoto, Y.: LUKE: deep contextualized entity representations with entity-aware self-attention. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 6442–6454. Association for Computational Linguistics, November 2020. https://aclanthology.org/2020.emnlp-main.523

  28. Yamada, I., Washio, K., Shindo, H., Matsumoto, Y.: Global entity disambiguation with BERT. In: NAACL. Association for Computational Linguistics (2022)

  29. Zhang, W., Stratos, K.: Understanding hard negatives in noise contrastive estimation. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 1090–1101. Association for Computational Linguistics, June 2021

  30. Zhang, Z., Han, X., Liu, Z., Jiang, X., Sun, M., Liu, Q.: ERNIE: enhanced language representation with informative entities. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp. 1441–1451. Association for Computational Linguistics, July 2019


Acknowledgements

We appreciate the insightful feedback from the anonymous reviewers. This work is jointly supported by grants from the National Science Foundation of China (No. 62006061), the Strategic Emerging Industry Development Special Funds of Shenzhen (No. JCYJ20200109113441941), and the Stable Support Program for Higher Education Institutions of Shenzhen (No. GXWD20201230155427003-20200824155011001).

Author information

Corresponding author

Correspondence to Baotian Hu.


Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Xu, Z., Chen, Y., Shi, S., Hu, B. (2022). Enhancing Entity Linking with Contextualized Entity Embeddings. In: Lu, W., Huang, S., Hong, Y., Zhou, X. (eds) Natural Language Processing and Chinese Computing. NLPCC 2022. Lecture Notes in Computer Science (LNAI), vol. 13552. Springer, Cham. https://doi.org/10.1007/978-3-031-17189-5_19

  • DOI: https://doi.org/10.1007/978-3-031-17189-5_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-17188-8

  • Online ISBN: 978-3-031-17189-5

  • eBook Packages: Computer Science, Computer Science (R0)
