A Context-Aware Model with Flow Mechanism for Conversational Question Answering

  • Conference paper
Neural Information Processing (ICONIP 2021)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1517)

Abstract

Conversational Question Answering (ConvQA) requires a deep understanding of the conversation history to answer the current question. However, most existing works ignore the sequential dependencies among history turns and treat all history indiscriminately. We propose a Flow-based Context-Aware Question Answering model to alleviate these problems. Specifically, we first use a hierarchical history selector to filter out irrelevant history turns according to word-level, utterance-level, and dialogue-level features. Then we introduce a FlowRNN to model the sequential dependencies between history turns along the dialogue direction. Finally, we incorporate these hidden dependencies into BERT for answer prediction. Experiments on QuAC, a large-scale conversational question answering dataset, show that our proposed method uses conversation history effectively and outperforms most of the recent ConvQA models.
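
The chapter's full text is not reproduced on this page, but the pipeline the abstract outlines (score and filter history turns with a selector, run a recurrent flow layer over the retained turns, and condition BERT's answer prediction on the resulting states) can be illustrated with a rough sketch. The PyTorch-style code below is a hypothetical minimal sketch: the module names, the GRU cell, the single linear scorer, the weighted-sum fusion, and the span head are all assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn


class HistorySelector(nn.Module):
    """Scores each history turn so that irrelevant turns can be filtered out.
    A single linear scorer stands in for the paper's hierarchical
    word-/utterance-/dialogue-level selector (assumption)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, turn_reprs: torch.Tensor) -> torch.Tensor:
        # turn_reprs: (num_turns, hidden) -> relevance weight per turn in [0, 1]
        return torch.sigmoid(self.scorer(turn_reprs)).squeeze(-1)


class FlowRNN(nn.Module):
    """Models sequential dependencies across history turns along the dialogue
    direction; the choice of a GRU cell is an assumption."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, turn_reprs: torch.Tensor) -> torch.Tensor:
        # turn_reprs: (1, num_turns, hidden) -> one flow state per turn
        flow_states, _ = self.rnn(turn_reprs)
        return flow_states


class FlowContextAwareQA(nn.Module):
    """Combines the selector, the FlowRNN and a BERT encoder; the weighted-sum
    fusion and the span head are placeholders, not the authors' design."""

    def __init__(self, bert, hidden_size: int = 768):
        super().__init__()
        self.bert = bert  # e.g. a Hugging Face BertModel (assumption)
        self.selector = HistorySelector(hidden_size)
        self.flow = FlowRNN(hidden_size)
        self.span_head = nn.Linear(2 * hidden_size, 2)  # start / end logits

    def forward(self, input_ids, attention_mask, history_turn_reprs):
        # Encode the current question and passage with BERT.
        token_states = self.bert(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state  # (batch, seq_len, hidden)

        # Filter history turns, then model their sequential dependencies.
        weights = self.selector(history_turn_reprs)           # (num_turns,)
        flow = self.flow(history_turn_reprs.unsqueeze(0))[0]  # (num_turns, hidden)
        history_summary = (weights.unsqueeze(-1) * flow).sum(dim=0)  # (hidden,)

        # Inject the history signal into every token representation.
        expanded = history_summary.expand(
            token_states.size(0), token_states.size(1), -1
        )
        fused = torch.cat([token_states, expanded], dim=-1)
        start_logits, end_logits = self.span_head(fused).split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)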

References

  1. Chen, Y., Wu, L., Zaki, M.J.: GraphFlow: exploiting conversation flow with graph neural networks for conversational machine comprehension. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, IJCAI 2020, pp. 1230–1236. ijcai.org (2020)

  2. Choi, E., et al.: QuAC: question answering in context. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, 31 October–4 November 2018, pp. 2174–2184. Association for Computational Linguistics (2018)

  3. Huang, H., Choi, E., Yih, W.: FlowQA: grasping flow in history for conversational machine comprehension. In: 7th International Conference on Learning Representations, ICLR 2019, New Orleans, LA, USA, 6–9 May 2019. OpenReview.net (2019)

  4. Qu, C., Yang, L., Qiu, M., Croft, W.B., Zhang, Y., Iyyer, M.: BERT with history answer embedding for conversational question answering. In: Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2019, Paris, France, 21–25 July 2019, pp. 1133–1136. ACM (2019)

  5. Qu, C., et al.: Attentive history selection for conversational question answering. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management, CIKM 2019, Beijing, China, 3–7 November 2019, pp. 1391–1400. ACM (2019)

  6. Reddy, S., Chen, D., Manning, C.D.: CoQA: a conversational question answering challenge. CoRR abs/1808.07042 (2018)

  7. Seo, M.J., Kembhavi, A., Farhadi, A., Hajishirzi, H.: Bidirectional attention flow for machine comprehension. In: 5th International Conference on Learning Representations, ICLR 2017, Toulon, France, 24–26 April 2017, Conference Track Proceedings. OpenReview.net (2017)

  8. Vaswani, A., et al.: Attention is all you need. In: Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 4–9 December 2017, Long Beach, CA, USA, pp. 5998–6008 (2017)

  9. Yeh, Y.T., Chen, Y.: FlowDelta: modeling flow information gain in reasoning for conversational machine comprehension. CoRR abs/1908.05117 (2019)

  10. Yuan, C., et al.: Multi-hop selector network for multi-turn response selection in retrieval-based chatbots. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, EMNLP-IJCNLP 2019, Hong Kong, China, 3–7 November 2019, pp. 111–120. Association for Computational Linguistics (2019)

  11. Qiu, M., et al.: Reinforced history backtracking for conversational question answering. In: Thirty-Fifth AAAI Conference on Artificial Intelligence, AAAI 2021, pp. 13718–13726. AAAI Press (2021)

Author information

Correspondence to Ailian Fang.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Song, D., Yang, Y., Fang, A. (2021). A Context-Aware Model with Flow Mechanism for Conversational Question Answering. In: Mantoro, T., Lee, M., Ayu, M.A., Wong, K.W., Hidayanto, A.N. (eds) Neural Information Processing. ICONIP 2021. Communications in Computer and Information Science, vol 1517. Springer, Cham. https://doi.org/10.1007/978-3-030-92310-5_72

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-92310-5_72

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-92309-9

  • Online ISBN: 978-3-030-92310-5

  • eBook Packages: Computer Science, Computer Science (R0)
