Abstract
This paper proposes a two-stage learning pipeline for community question answering (CQA) in the Buddhism domain. In the first stage, we train an answer selection model based on Keywords-BERT that performs deep semantic matching between questions and candidate answers; given a question, the model selects the answer with the highest relatedness score. The second stage reuses the trained Keywords-BERT model to eliminate redundant information, keeping only the sentences most relevant to the question as an extractive summary of the selected answer. Our method requires only standard QA pairs for training, which significantly reduces both the annotation cost and the domain-knowledge threshold for annotators. We evaluated the model on a self-built Buddhism CQA dataset; the results show that the proposed pipeline outperforms state-of-the-art methods such as BERT-Sum in both summary quality and robustness.
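To make the two-stage flow concrete, the sketch below mirrors the pipeline described above. It is a minimal illustration under stated assumptions, not the authors' implementation: the function names (relatedness, select_answer, extract_summary) are hypothetical, and a simple token-overlap score stands in for the trained Keywords-BERT matcher purely to keep the example runnable.

    # Minimal sketch of the two-stage pipeline described in the abstract.
    # NOTE: `relatedness` is a stand-in for the trained Keywords-BERT matcher;
    # a Jaccard token-overlap score is used here only so the example runs.
    # All names are illustrative, not taken from the paper.

    def relatedness(question: str, text: str) -> float:
        # Placeholder scorer: Jaccard overlap of lowercase tokens.
        q, t = set(question.lower().split()), set(text.lower().split())
        return len(q & t) / max(len(q | t), 1)

    def select_answer(question: str, candidates: list[str]) -> str:
        # Stage 1: pick the candidate answer with the highest relatedness score.
        return max(candidates, key=lambda a: relatedness(question, a))

    def extract_summary(question: str, sentences: list[str], top_k: int = 3) -> str:
        # Stage 2: score each sentence of the selected answer against the
        # question and keep only the top-k, preserving their original order.
        ranked = sorted(sentences, key=lambda s: relatedness(question, s), reverse=True)
        kept = set(ranked[:top_k])
        return " ".join(s for s in sentences if s in kept)

In the paper's setting, both stages would call the same fine-tuned matcher, which is why a single set of QA-pair annotations suffices for training the whole pipeline.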
References
Deng, Y., et al.: Joint learning of answer selection and answer summary generation in community question answering. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 7651–7658 (2020)
Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
Jiao, Z., Sun, S., Sun, K.: Chinese lexical analysis with deep Bi-GRU-CRF network. arXiv preprint arXiv:1807.01882 (2018)
Liu, Y.: Fine-tune BERT for extractive summarization. arXiv preprint arXiv:1903.10318 (2019)
Miao, C., Cao, Z., Tam, Y.C.: Keyword-attentive deep semantic matching. arXiv preprint arXiv:2003.11516 (2020)
Song, H., Ren, Z., Liang, S., Li, P., Ma, J., de Rijke, M.: Summarizing answers in non-factoid community question-answering. In: Proceedings of the 10th ACM International Conference on Web Search and Data Mining, pp. 405–414 (2017)
Su, D., Xu, Y., Yu, T., Siddique, F.B., Barezi, E., Fung, P.: CAiRE-COVID: a question answering and query-focused multi-document summarization system for COVID-19 scholarly information management. In: Proceedings of the 1st Workshop on NLP for COVID-19 (Part 2) at EMNLP 2020. Association for Computational Linguistics (2020)
Xie, Y., Shen, Y., Li, Y., Yang, M., Lei, K.: Attentive user-engaged adversarial neural network for community question answering. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 9322–9329 (2020)
Zhang, N., Deng, S., Li, J., Chen, X., Zhang, W., Chen, H.: Summarizing Chinese medical answer with graph convolution networks and question-focused dual attention. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings, pp. 15–24 (2020)
Zhang, X., Li, S., Sha, L., Wang, H.: Attentive interactive neural networks for answer selection in community question answering. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 31 (2017)
Zhou, X., Hu, B., Chen, Q., Wang, X.: Recurrent convolutional neural network for answer selection in community question answering. Neurocomputing 274, 8–18 (2018)
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Du, J., Chen, J., Wang, S., Li, J., Xiao, Z. (2021). Towards a Two-Stage Method for Answer Selection and Summarization in Buddhism Community Question Answering. In: Fang, L., Chen, Y., Zhai, G., Wang, J., Wang, R., Dong, W. (eds.) Artificial Intelligence. CICAI 2021. Lecture Notes in Computer Science, vol. 13070. Springer, Cham. https://doi.org/10.1007/978-3-030-93049-3_21