Abstract
Most EBMT systems select the best example based on the similarity between the input sentence and the stored examples. However, much matching and mutual-translation information in the examples remains unexploited. This paper introduces a log-linear translation model into EBMT in order to adequately incorporate different kinds of features inherent in the translation examples. Instead of designing the translation model by human intuition alone, the paper formally constructs a multi-dimensional feature space that covers features of different aspects. In the experiments, the proposed model yields significantly better results.
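The log-linear model described above ranks translation candidates by a weighted combination of feature scores. A minimal sketch of that scoring scheme is shown below; the feature names (`example_similarity`, `translation_conf`, `lm_prob`) and all numeric values are illustrative assumptions, not the paper's actual feature set or learned weights.

```python
import math

def loglinear_score(features, weights):
    """Score a translation candidate as a weighted sum of log feature values.

    In a log-linear model, P(e|f) is proportional to
    exp(sum_i w_i * h_i(e, f)), so ranking candidates by the weighted sum
    of log-domain feature scores is equivalent to ranking by probability.
    """
    return sum(weights[name] * math.log(value) for name, value in features.items())

# Hypothetical feature values for two candidate translations of one input:
# similarity to the matched example, a translation-confidence score, and a
# language-model probability (all numbers are made up for illustration).
candidates = {
    "candidate_a": {"example_similarity": 0.9, "translation_conf": 0.4, "lm_prob": 0.05},
    "candidate_b": {"example_similarity": 0.7, "translation_conf": 0.8, "lm_prob": 0.10},
}
weights = {"example_similarity": 1.0, "translation_conf": 1.0, "lm_prob": 0.5}

# Pick the highest-scoring candidate under the log-linear model.
best = max(candidates, key=lambda c: loglinear_score(candidates[c], weights))
```

In a real system the weights would be tuned, e.g. by discriminative training or generalized iterative scaling, rather than set by hand; the sketch only illustrates how heterogeneous example features combine into a single score.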
Sponsored by the National Natural Science Foundation of China (60375019).
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Chen, Y., Yang, M., Li, S., Jiang, H. (2006). Feature Rich Translation Model for Example-Based Machine Translation. In: Matsumoto, Y., Sproat, R.W., Wong, KF., Zhang, M. (eds) Computer Processing of Oriental Languages. Beyond the Orient: The Research Challenges Ahead. ICCPOL 2006. Lecture Notes in Computer Science(), vol 4285. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11940098_36
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-49667-0
Online ISBN: 978-3-540-49668-7