Abstract
In this paper, we study the task of multi-turn response selection in retrieval-based dialogue systems. Previous approaches focus on matching the response with utterances in the context to distill important matching information and on modeling the sequential relationship among utterances. These approaches do not take into account the positional relationship and inner semantic relevance between the utterances and the query (i.e., the last utterance). We propose a memory-based network (MBN) that builds an effective memory integrating the positional relationship and inner semantic relevance between the utterances and the query. We then apply multiple attentions over the memory to learn context representations at multiple levels, similar to how humans repeatedly think before responding. Experimental results on a public data set for multi-turn response selection show the effectiveness of our MBN model.
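The core idea of applying multiple attentions over a memory can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the additive query update, and the number of hops are assumptions for illustration only. The memory rows stand in for encoded utterance vectors and the query for the encoded last utterance; each hop attends over the memory and refines the query representation, giving one "level" of context.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def multi_hop_attention(memory, query, hops=3):
    """Repeatedly attend over the memory, refining the query
    representation at each hop (hypothetical sketch, not the
    paper's exact architecture)."""
    q = query
    for _ in range(hops):
        scores = memory @ q        # relevance of each memory slot to q
        weights = softmax(scores)  # attention distribution over slots
        context = weights @ memory # weighted sum of memory slots
        q = q + context            # refine the query with the context
    return q

rng = np.random.default_rng(0)
memory = rng.normal(size=(5, 8))  # 5 utterance vectors, dimension 8
query = rng.normal(size=8)        # encoded last utterance
rep = multi_hop_attention(memory, query)
print(rep.shape)  # (8,)
```

The final representation `rep` could then be matched against a candidate response vector (e.g., by dot product) for scoring; in the actual model the memory construction additionally encodes position and semantic relevance to the query.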
Acknowledgements
This work is supported by the Science and Technology Commission of Shanghai Municipality Grant (No. 15ZR1410700) and the open project of Shanghai Key Laboratory of Trustworthy Computing (No. 07dz22304201604).
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Lu, X., Lan, M., Wu, Y. (2018). Memory-Based Model with Multiple Attentions for Multi-turn Response Selection. In: Cheng, L., Leung, A., Ozawa, S. (eds.) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol. 11302. Springer, Cham. https://doi.org/10.1007/978-3-030-04179-3_26
DOI: https://doi.org/10.1007/978-3-030-04179-3_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-04178-6
Online ISBN: 978-3-030-04179-3
eBook Packages: Computer Science (R0)