Knowledge-Aware Self-Attention Networks for Document Grounded Dialogue Generation

  • Conference paper
  • First Online:
Knowledge Science, Engineering and Management (KSEM 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11776)

Abstract

Dialogue systems have attracted increasing attention. Unlike traditional open-domain conversational systems, document grounded dialogue generation aims to ground the semantics of a conversation in a specified document and to leverage contextual cues from the dialogue history to generate on-topic and coherent responses, which remains a relatively new and challenging task. Prior studies based on neural sequence-to-sequence models often treat the dialogue history and the given document independently and fail to model the contextual dependence between them. To understand the dialogue better and respond more appropriately and informatively, we present a novel knowledge-aware self-attention approach for document grounded dialogue, called DialogTransformer. DialogTransformer fully leverages the semantic knowledge from both the dialogue history and the given document to jointly improve the content quality of generated responses. We conduct extensive experiments on the CMU-DoG benchmark dataset, and the results show that our approach outperforms several state-of-the-art models and generates more appropriate and informative responses.
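The full method is not reproduced on this page. As a rough, hypothetical sketch of the kind of knowledge-aware self-attention the abstract describes, the PyTorch snippet below shows a Transformer decoder block that attends separately over the dialogue history and the grounding document and fuses the two context vectors with a learned gate. All names and design choices here (KnowledgeAwareDecoderLayer, the sigmoid gating, the dimensions) are illustrative assumptions, not the authors' DialogTransformer implementation.

```python
import torch
import torch.nn as nn


class KnowledgeAwareDecoderLayer(nn.Module):
    """Hypothetical sketch of a decoder block that attends over both the
    dialogue history and the grounding document (not the authors' code)."""

    def __init__(self, d_model=512, n_heads=8, d_ff=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.history_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.doc_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.gate = nn.Linear(2 * d_model, d_model)  # fuses the two knowledge sources
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(3)])
        self.dropout = nn.Dropout(dropout)

    def forward(self, tgt, history_mem, doc_mem, tgt_mask=None):
        # 1. Masked self-attention over the partially generated response.
        x, _ = self.self_attn(tgt, tgt, tgt, attn_mask=tgt_mask)
        tgt = self.norms[0](tgt + self.dropout(x))
        # 2. Attend separately over the dialogue history and the document,
        #    then fuse the two context vectors with a learned sigmoid gate.
        h, _ = self.history_attn(tgt, history_mem, history_mem)
        d, _ = self.doc_attn(tgt, doc_mem, doc_mem)
        g = torch.sigmoid(self.gate(torch.cat([h, d], dim=-1)))
        fused = g * h + (1 - g) * d
        tgt = self.norms[1](tgt + self.dropout(fused))
        # 3. Position-wise feed-forward sublayer.
        tgt = self.norms[2](tgt + self.dropout(self.ff(tgt)))
        return tgt
```

In such a design, several of these layers would be stacked on top of encoder outputs for the document and the dialogue history; the gate lets each target position weight document knowledge against conversational context when generating the response.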

Notes

  1. https://github.com/festvox/datasets-CMU_DoG.

  2. https://github.com/tangxiangru/KAT.

Acknowledgement

This work was supported by the National Natural Science Foundation of China (No. 61402191), the Fundamental Research Funds for the Central Universities (No. CCNU18TS044), and the Thirteenth Five-Year Research Planning Project of the National Language Committee (No. WT135-11).

Author information

Correspondence to Po Hu.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Tang, X., Hu, P. (2019). Knowledge-Aware Self-Attention Networks for Document Grounded Dialogue Generation. In: Douligeris, C., Karagiannis, D., Apostolou, D. (eds) Knowledge Science, Engineering and Management. KSEM 2019. Lecture Notes in Computer Science, vol 11776. Springer, Cham. https://doi.org/10.1007/978-3-030-29563-9_35

  • DOI: https://doi.org/10.1007/978-3-030-29563-9_35

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-29562-2

  • Online ISBN: 978-3-030-29563-9

  • eBook Packages: Computer Science, Computer Science (R0)
