DOI: 10.1145/3603165.3607426
poster

Multi-language Reverse Dictionary Model Based on Improved mBERT

Published: 25 September 2023

ABSTRACT

A reverse dictionary takes a descriptive definition as input and returns a ranked list of vocabulary words matching that description. Although reverse dictionaries have wide practical value, little research has been done on them, particularly in multilingual settings. To address this gap and improve reverse-dictionary accuracy across languages, this paper proposes a multilingual reverse dictionary model based on mBERT. It extends the original model with additional features such as the part-of-speech of words. The improved model is validated on both English and Chinese datasets, and experimental results show that it outperforms the baseline models on most metrics.
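To make the task concrete, the sketch below ranks a small vocabulary against an input definition. This is not the paper's mBERT-based model: the glosses, function names, and the bag-of-words cosine similarity are placeholder assumptions used only to illustrate the reverse-dictionary input/output shape (definition in, ranked word list out).

```python
# Toy sketch of the reverse-dictionary task: rank vocabulary entries by the
# similarity between an input definition and each entry's gloss. The paper's
# model uses mBERT embeddings plus part-of-speech features; here a simple
# bag-of-words cosine similarity stands in purely for illustration.
from collections import Counter
import math

# Hypothetical mini-dictionary (word -> gloss); not from the paper's datasets.
GLOSSES = {
    "telescope": "an instrument used to view distant objects in space",
    "microscope": "an instrument used to view very small objects",
    "bicycle": "a vehicle with two wheels powered by pedals",
}

def _vec(text):
    """Bag-of-words term-count vector for a lowercase, whitespace-split text."""
    return Counter(text.lower().split())

def _cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def reverse_dictionary(definition, glosses=GLOSSES):
    """Return vocabulary words ranked by similarity to the input definition."""
    query = _vec(definition)
    scored = [(word, _cosine(query, _vec(gloss))) for word, gloss in glosses.items()]
    return [word for word, _ in sorted(scored, key=lambda pair: -pair[1])]
```

A real system would replace `_vec`/`_cosine` with contextual embeddings (here, mBERT) so that definitions paraphrased in any supported language can still retrieve the right word.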


  • Published in

    ACM TURC '23: Proceedings of the ACM Turing Award Celebration Conference - China 2023
    July 2023
    173 pages
ISBN: 9798400702334
DOI: 10.1145/3603165

    Copyright © 2023 Owner/Author

    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Qualifiers

    • poster
    • Research
    • Refereed limited
