DOI: 10.1145/3650215.3650288
Research article

A Transformer-based Automatic Scoring Model for Translation Jobs

Published: 16 April 2024

ABSTRACT

In the field of natural language processing, evaluating translation quality has been a long-standing challenge. Traditionally, translation quality has been assessed manually, a process that is time-consuming, subjective, and resource-intensive. These limitations motivate automated methods that streamline the process and increase objectivity and efficiency. To address this problem, this paper introduces an automatic scoring system based on the Transformer model for evaluating the quality of translation tasks. The Transformer is a powerful sequence-to-sequence model that has achieved significant success in natural language processing. Using the Transformer, we cast translation quality evaluation as a machine learning problem and automatically analyze the differences between the translated text and the reference translation. The scoring system is trained on large parallel corpora, enabling the model to learn effective translation rules and semantic representations. In the testing phase, the system accepts a translation to be evaluated and assigns it an automatic score representing its translation quality. This automatic scoring system can improve the efficiency and accuracy of translation evaluation while reducing reliance on manual assessment.
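The abstract describes the approach only at a high level: encode the candidate translation together with its reference and predict a quality score. As a rough illustration, the short PyTorch sketch below shows one plausible way such a scorer could be structured. It is not taken from the paper; the class name TranslationScorer, the mean-pooling strategy, the regression head, and all hyperparameters are assumptions made for illustration.

    # Minimal sketch (assumed structure, not the authors' implementation) of a
    # Transformer-based translation quality scorer: the candidate and reference
    # token ids are concatenated, encoded, pooled, and mapped to a single score.
    import torch
    import torch.nn as nn

    class TranslationScorer(nn.Module):
        def __init__(self, vocab_size=32000, d_model=512, nhead=8, num_layers=6):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
            self.head = nn.Linear(d_model, 1)  # regression head for the quality score

        def forward(self, token_ids, padding_mask=None):
            # token_ids: concatenated candidate + reference ids, shape (batch, seq_len)
            x = self.embed(token_ids)
            h = self.encoder(x, src_key_padding_mask=padding_mask)
            pooled = h.mean(dim=1)                 # mean-pool over the sequence
            return self.head(pooled).squeeze(-1)   # one score per candidate/reference pair

    # Hypothetical usage: a batch of 2 already-tokenized, padded pairs.
    model = TranslationScorer()
    tokens = torch.randint(0, 32000, (2, 64))
    scores = model(tokens)  # tensor of shape (2,) with predicted quality scores

In practice such a model would be trained on parallel corpora with quality labels (e.g., human judgements), so that the predicted score correlates with translation quality; the pooling and head shown here are only one common design choice.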


Published in

ICMLCA '23: Proceedings of the 2023 4th International Conference on Machine Learning and Computer Application
October 2023, 1065 pages
ISBN: 9798400709449
DOI: 10.1145/3650215

      Copyright © 2023 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States



