Abstract
Recent advances in deep learning have delivered state-of-the-art performance in medical analysis. However, deep neural networks (DNNs) require large amounts of training data with high-quality annotations, which are unavailable or expensive to obtain in the medical domain. Research on medical-domain neural machine translation (NMT) is largely limited by the lack of parallel sentences annotated with medical background knowledge. To this end, we propose YuQ, a knowledge-driven Chinese-Uyghur NMT dataset grounded in medical-domain knowledge graphs. Our corpus contains 65K parallel sentences (130K utterances) from the medical domain. By introducing medical-domain glossary knowledge into the training model, we address the challenge of low translation accuracy on professional terms in Chinese-Uyghur machine translation. We provide several benchmark models. Ablation study results show that the models can be enhanced by introducing domain knowledge.
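The abstract describes injecting bilingual glossary knowledge into NMT training. One common realization of this idea (a minimal illustrative sketch, not necessarily the paper's exact pipeline) is to append target-side glossary translations of matched source-side terms to each source sentence before training; the `<gls>`/`</gls>` markers and the glossary entries below are hypothetical placeholders, not entries from the YuQ dataset.

```python
def annotate_with_glossary(src_sentence, glossary):
    """Append target-side translations of matched glossary terms to the source.

    glossary: dict mapping source-language medical terms to target-language
    translations. Matched translations are appended inside hypothetical
    <gls> ... </gls> markers, which would be added to the NMT vocabulary.
    """
    matched = [tgt for src_term, tgt in glossary.items() if src_term in src_sentence]
    if not matched:
        return src_sentence  # no glossary term found; leave the sentence unchanged
    return src_sentence + " <gls> " + " ".join(matched) + " </gls>"


# Placeholder glossary entries for illustration only.
glossary = {"高血压": "hypertension-UG", "糖尿病": "diabetes-UG"}
print(annotate_with_glossary("患者有高血压病史。", glossary))
```

At training time, the annotated source side lets the model learn a soft copy behavior for in-domain terminology without changing the model architecture.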
Acknowledgments
We thank the anonymous reviewers for their valuable feedback. Qing Yu and Zhe Li contributed equally to this research. This work was supported by the National Natural Science Foundation of China projects Research on the Construction of Chinese and Uygur Medical and Health Terms Resource Database (Grant No. 61562082) and Research on Key Technologies of the Uyghur-Chinese Phonetic Translation System (Grant No. U1603262), the Xinjiang Uygur Autonomous Region Graduate Research and Innovation Project (Grant No. XJ2020G071), and Dark Web Intelligence Analysis and User Identification Technology (Grant No. 2017YFC0820702-3), and was funded by the National Engineering Laboratory for Public Safety Risk Perception and Control by Big Data (PSRPC).
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Yu, Q., Li, Z., Sheng, J., Sun, J., Slamu, W. (2020). YuQ: A Chinese-Uyghur Medical-Domain Neural Machine Translation Dataset Towards Knowledge-Driven. In: Li, J., Way, A. (eds) Machine Translation. CCMT 2020. Communications in Computer and Information Science, vol 1328. Springer, Singapore. https://doi.org/10.1007/978-981-33-6162-1_4
Publisher Name: Springer, Singapore
Print ISBN: 978-981-33-6161-4
Online ISBN: 978-981-33-6162-1
eBook Packages: Computer Science (R0)