
A Key-Phrase Aware End2end Neural Response Generation Model

  • Conference paper
  • First Online:
Natural Language Processing and Chinese Computing (NLPCC 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11839)

Abstract

Previous Seq2Seq models for chitchat assume that each word in the target sequence has a direct correspondence with words in the source sequence and that all target words are equally important. This assumption does not always hold, since often only part of the response is relevant to the message. For models built on this assumption, irrelevant response words can harm semantic association modeling, which is a core task in open-domain dialogue modeling. In this work, to address the challenge of semantic association modeling, we automatically recognize key-phrases in the responses of the training data and feed this supervision into an enhanced key-phrase aware Seq2Seq model with stronger semantic association modeling capability. The model consists of an encoder and a two-layer decoder, where the encoder and the first-layer sub-decoder mainly learn semantic association and the second-layer sub-decoder generates the response. Experimental results show that the model effectively exploits key-phrase information for semantic association modeling and significantly outperforms baseline models in terms of response appropriateness and informativeness.
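
To make the two-layer structure concrete, the following is a minimal, hypothetical PyTorch sketch of an encoder plus two-layer decoder, in which the first sub-decoder is supervised with per-token key-phrase labels and the second sub-decoder generates the response. All module names, dimensions, and the way the key-phrase supervision is injected are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of a key-phrase aware encoder / two-layer decoder.
    # Names, sizes, and wiring are assumptions for illustration only.
    import torch
    import torch.nn as nn


    class KeyPhraseAwareSeq2Seq(nn.Module):
        def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Encoder over the message.
            self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
            # First-layer sub-decoder: models semantic association; its states
            # are supervised with key-phrase labels (1 = key-phrase token,
            # 0 = non-key-phrase token).
            self.sub_decoder1 = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
            self.key_phrase_head = nn.Linear(hid_dim, 2)
            # Second-layer sub-decoder: generates the response conditioned on
            # the key-phrase-aware states of the first layer.
            self.sub_decoder2 = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
            self.vocab_head = nn.Linear(hid_dim, vocab_size)

        def forward(self, message_ids, response_ids):
            # Encode the message; use the final state as a fixed context vector
            # (a full model would use attention over the encoder states instead).
            _, enc_h = self.encoder(self.embed(message_ids))
            context = enc_h[-1].unsqueeze(1).expand(-1, response_ids.size(1), -1)

            resp_emb = self.embed(response_ids)
            # Layer 1: key-phrase-aware states and per-token key-phrase logits.
            h1, _ = self.sub_decoder1(torch.cat([resp_emb, context], dim=-1), enc_h)
            key_phrase_logits = self.key_phrase_head(h1)
            # Layer 2: response generation conditioned on the layer-1 states.
            h2, _ = self.sub_decoder2(torch.cat([resp_emb, h1], dim=-1), enc_h)
            vocab_logits = self.vocab_head(h2)
            return key_phrase_logits, vocab_logits


    if __name__ == "__main__":
        model = KeyPhraseAwareSeq2Seq(vocab_size=1000)
        msg = torch.randint(0, 1000, (2, 12))   # 2 messages, 12 tokens each
        resp = torch.randint(0, 1000, (2, 15))  # 2 responses, 15 tokens each
        kp_logits, vocab_logits = model(msg, resp)
        print(kp_logits.shape, vocab_logits.shape)  # (2, 15, 2) (2, 15, 1000)

In such a setup the key-phrase logits would be trained jointly with the generation loss, so that the first layer learns which response tokens carry the semantic association with the message while the second layer handles surface realization.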


Notes

  1. We annotated key-phrases in the responses of 200 message-response pairs randomly sampled from Baidu Tieba. In 78% of the pairs, non-key-phrase words exist, i.e., only part of the response is relevant to the message.

  2. https://tieba.baidu.com.

  3. https://github.com/baidu/lac.



Author information


Corresponding author

Correspondence to Wanxiang Che.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Xu, J., Wang, H., Niu, Z., Wu, H., Che, W. (2019). A Key-Phrase Aware End2end Neural Response Generation Model. In: Tang, J., Kan, M.Y., Zhao, D., Li, S., Zan, H. (eds) Natural Language Processing and Chinese Computing. NLPCC 2019. Lecture Notes in Computer Science, vol 11839. Springer, Cham. https://doi.org/10.1007/978-3-030-32236-6_6

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-32236-6_6

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32235-9

  • Online ISBN: 978-3-030-32236-6

  • eBook Packages: Computer Science, Computer Science (R0)
