
ABCD: Analogy-Based Controllable Data Augmentation

Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 13082)

Abstract

We propose an analogy-based data augmentation approach for sentiment and style transfer named Analogy-Based Controllable Data Augmentation (ABCD). The objective of data augmentation is to expand the number of sentences available from a limited amount of data. We are given two unpaired corpora with different styles. During augmentation, we retain the style of the original text while changing words to generate new sentences. We first train a self-attention-based convolutional neural network to compute, for a given sentence, the distribution of each word's contribution to style. We call words with a high style contribution style-characteristic words. By substituting content words and style-characteristic words separately, we generate two new sentences. We then solve an analogy between the original sentence and these two additional sentences to generate a third one. The results show that our proposed approach decreases perplexity by about 4 points and outperforms baselines on three transfer datasets.
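The final generation step described above can be illustrated with a toy word-level analogy solver. This is a minimal sketch, not the authors' implementation: it assumes the original sentence A, the content-substituted sentence B, and the style-substituted sentence C are token-aligned (equal length, differing from A only by single-word substitutions), and solves the proportional analogy A : B :: C : D position by position. The function name and example sentences are hypothetical.

```python
def solve_word_analogy(a, b, c):
    """Solve the proportional analogy a : b :: c : d at the word level.

    Assumes the three sentences are token-aligned: b and c each differ
    from a only by word substitutions at some positions.
    Returns the fourth sentence d as a list of tokens, or None if the
    analogy has no trivial word-level solution.
    """
    if not (len(a) == len(b) == len(c)):
        return None  # this toy solver only handles aligned sentences
    d = []
    for wa, wb, wc in zip(a, b, c):
        if wa == wb:        # position unchanged in b -> take c's word
            d.append(wc)
        elif wa == wc:      # position unchanged in c -> take b's word
            d.append(wb)
        else:
            return None     # word changed in both b and c: ambiguous
    return d

# Hypothetical example: substitute a content word in b, a
# style-characteristic word in c, and combine both by analogy.
a = "the food was good".split()
b = "the meal was good".split()    # content word substituted
c = "the food was awful".split()   # style-characteristic word substituted
print(solve_word_analogy(a, b, c))  # -> ['the', 'meal', 'was', 'awful']
```

The solved sentence D inherits the content substitution from B and the style substitution from C, which is how the analogy step yields an extra augmented sentence beyond the two direct substitutions.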




Correspondence to Shuo Yang.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Yang, S., Lepage, Y. (2021). ABCD: Analogy-Based Controllable Data Augmentation. In: Aranha, C., Martín-Vide, C., Vega-Rodríguez, M.A. (eds) Theory and Practice of Natural Computing. TPNC 2021. Lecture Notes in Computer Science, vol 13082. Springer, Cham. https://doi.org/10.1007/978-3-030-90425-8_6


  • DOI: https://doi.org/10.1007/978-3-030-90425-8_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-90424-1

  • Online ISBN: 978-3-030-90425-8

  • eBook Packages: Computer Science, Computer Science (R0)
