Memory-Based Model with Multiple Attentions for Multi-turn Response Selection

  • Conference paper
  • Neural Information Processing (ICONIP 2018)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11302)


Abstract

In this paper, we study the task of multi-turn response selection in retrieval-based dialogue systems. Previous approaches focus on matching the response with utterances in the context to distill important matching information, and on modeling the sequential relationship among utterances. Such approaches do not take into account the positional relationship and inner semantic relevance between the utterances and the query (i.e., the last utterance). We propose a memory-based network (MBN) that builds an effective memory integrating the positional relationship and inner semantic relevance between the utterances and the query. We then apply multiple attentions over the memory to learn representations of the context at multiple levels, which resembles the human behavior of thinking repeatedly before responding. Experimental results on a public data set for multi-turn response selection show the effectiveness of our MBN model.
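The two ideas in the abstract — a memory that weights each utterance by its position and its semantic relevance to the query, and repeated attention "hops" over that memory — can be sketched roughly as below. This is a minimal illustrative sketch, not the paper's exact formulation: the weighting scheme, the `lam` mixing parameter, the residual update, and the hop count are all assumptions, and the real model uses learned parameters (e.g. GRU encoders and trained attention weights) rather than fixed cosine scores.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def build_memory(utterances, query, lam=0.5):
    """Build a memory of utterance vectors, each scaled by a mix of
    positional closeness to the query and cosine relevance to it.
    (Hypothetical weighting, standing in for the paper's learned one.)"""
    n = len(utterances)
    mem = []
    for i, u in enumerate(utterances):
        pos = (i + 1) / n  # later utterances sit closer to the query
        rel = u @ query / (np.linalg.norm(u) * np.linalg.norm(query) + 1e-8)
        mem.append((lam * pos + (1 - lam) * rel) * u)
    return np.stack(mem)          # shape: (n_utterances, dim)

def multi_hop_attention(memory, query, hops=3):
    """Attend over the memory several times, refining a context state
    each hop -- the 'repeatedly think before responding' behavior."""
    state = query
    for _ in range(hops):
        scores = softmax(memory @ state)   # attention over utterances
        state = state + scores @ memory    # residual update per hop
    return state
```

A candidate response vector could then be scored against the final `state` (e.g. by a dot product) to rank responses; that scoring step is likewise an assumption here.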


Notes

  1. http://tcci.ccf.org.cn/conference/2018/dldoc/taskgline05.pdf
  2. http://weibo.com/


Acknowledgements

This work is supported by the Science and Technology Commission of Shanghai Municipality Grant (No. 15ZR1410700) and the open project of Shanghai Key Laboratory of Trustworthy Computing (No. 07dz22304201604).

Author information

Correspondence to Man Lan or Yuanbin Wu.


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Lu, X., Lan, M., Wu, Y. (2018). Memory-Based Model with Multiple Attentions for Multi-turn Response Selection. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol 11302. Springer, Cham. https://doi.org/10.1007/978-3-030-04179-3_26


  • DOI: https://doi.org/10.1007/978-3-030-04179-3_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04178-6

  • Online ISBN: 978-3-030-04179-3

  • eBook Packages: Computer Science, Computer Science (R0)
