Abstract
Paraphrase generation has long been a challenging task in NLP. Despite the considerable achievements of previous research, existing supervised approaches require many annotated paraphrase pairs, which are expensive to collect. Unsupervised approaches, on the other hand, tend to generate output that is syntactically similar to the source text and lacks diversity in grammatical structure. To tackle this challenge, we propose a Transformer-based model applying an Adversarial approach for Unsupervised Syntax-Guided Paraphrase Generation (AUSPG). AUSPG combines a syntax discriminator with a Transformer framework to paraphrase sentences from disentangled semantic and syntactic spaces without the need for annotated pairs. More specifically, we deploy a Transformer encoder without position embeddings to obtain semantic representations, and use the syntax discriminator to further regularize the semantic space. In addition, the disentanglement enables AUSPG to manipulate the embedding of the syntactic space to generate syntax-guided paraphrases. Finally, we conduct extensive experiments to substantiate the validity and effectiveness of our proposal. The results reveal that AUSPG significantly outperforms existing baselines and generates more diverse paraphrase sentences.
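The disentanglement described above relies on a key property: a Transformer encoder that receives no position embeddings is permutation-equivariant, so pooling its outputs yields a sentence representation that is insensitive to word order and thus discards syntax. The following is a minimal numpy sketch of that idea only, not the authors' implementation; the single-head attention, random weights, and mean pooling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Single-head self-attention; note no position embedding is added to X.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

def semantic_repr(X, Wq, Wk, Wv):
    # Mean-pool the attention outputs into one sentence-level vector.
    return self_attention(X, Wq, Wk, Wv).mean(axis=0)

d = 8
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
tokens = rng.standard_normal((5, d))   # embeddings of 5 tokens
shuffled = tokens[[3, 1, 4, 0, 2]]     # same tokens, different word order

a = semantic_repr(tokens, Wq, Wk, Wv)
b = semantic_repr(shuffled, Wq, Wk, Wv)
assert np.allclose(a, b)  # order-invariant: positional (syntactic) cues are gone
```

Because reordering the tokens leaves this pooled vector unchanged, any syntactic signal must be carried elsewhere; in AUSPG the syntax discriminator adversarially pushes residual syntactic information out of this semantic space, and a separate syntactic embedding supplies the structure at generation time.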
Acknowledgment
This work was supported by the Joint Funds of the National Natural Science Foundation of China (Grant No. U21B2020). Gongshen Liu is the corresponding author.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Xue, T., Zhao, Y., Liu, G., Li, X. (2022). An Adversarial Approach for Unsupervised Syntax-Guided Paraphrase Generation. In: Lu, W., Huang, S., Hong, Y., Zhou, X. (eds) Natural Language Processing and Chinese Computing. NLPCC 2022. Lecture Notes in Computer Science(), vol 13551. Springer, Cham. https://doi.org/10.1007/978-3-031-17120-8_29
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-17119-2
Online ISBN: 978-3-031-17120-8