Abstract
Query-focused extractive summarization aims to create a summary by selecting sentences from the original documents according to query relevance and redundancy. With recent advances in neural network models for natural language processing, attention mechanisms have been widely used to address the text summarization task. However, existing methods typically rely on coarse-grained sentence-level attention, which is likely to miss the intent of the query and cause relatedness misalignment. To address this problem, we introduce a fine-grained, interactive word-by-word attention into the query-focused extractive summarization system, allowing it to capture the real intent of the query. We implement this idea with a Compare-Aggregate model, simulating the interactively attentive reading and thinking of human behavior. We also leverage external conceptual knowledge to enrich the model and bridge the expression gap between query and document. To evaluate our method, we conduct experiments on the DUC 2005–2007 query-focused summarization benchmark datasets. Experimental results demonstrate that our proposed approach outperforms state-of-the-art methods.
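The abstract does not spell out the architecture, but the general compare-aggregate pattern it names can be illustrated concretely. Below is a minimal PyTorch sketch of a sentence scorer with word-by-word attention between a query and a candidate sentence: every layer size, the element-wise comparison function, and the CNN aggregator are illustrative assumptions, not the paper's exact design.

```python
# Minimal sketch of a compare-aggregate sentence scorer with
# word-by-word attention. Hyperparameters and the comparison/aggregation
# choices are assumptions for illustration only.
import torch
import torch.nn as nn

class CompareAggregate(nn.Module):
    def __init__(self, emb_dim=300, hidden=150, kernel_sizes=(1, 2, 3)):
        super().__init__()
        self.proj = nn.Linear(emb_dim, hidden)
        # One 1-D convolution per kernel size aggregates the comparison vectors.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, hidden, k, padding=k // 2) for k in kernel_sizes
        )
        self.score = nn.Linear(hidden * len(kernel_sizes), 1)

    def forward(self, query, sentence):
        # query: (B, Lq, emb_dim); sentence: (B, Ls, emb_dim)
        q = torch.relu(self.proj(query))      # (B, Lq, H)
        s = torch.relu(self.proj(sentence))   # (B, Ls, H)
        # Word-by-word attention: each sentence word attends to all query words.
        attn = torch.softmax(s @ q.transpose(1, 2), dim=-1)  # (B, Ls, Lq)
        aligned = attn @ q                    # per-word summary of the query
        # Compare: element-wise product keeps fine-grained match signals.
        cmp = (s * aligned).transpose(1, 2)   # (B, H, Ls)
        # Aggregate: CNN over the comparison sequence, then max-pooling.
        pooled = torch.cat(
            [conv(cmp).max(dim=-1).values for conv in self.convs], dim=-1
        )
        return self.score(pooled).squeeze(-1)  # relevance score per sentence

# Usage: score a batch of 4 (query, sentence) pairs with random embeddings.
model = CompareAggregate()
scores = model(torch.randn(4, 8, 300), torch.randn(4, 25, 300))
```

In this sketch the attention is computed per word rather than per sentence, which is what lets the model react to individual query terms instead of a single pooled query vector; external knowledge (e.g., concept embeddings) could be injected by concatenating extra features to the word embeddings before projection.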
Acknowledgement
This work is supported in part by the Project under Grant No. BMKY2019B04-1 and by the Strategic Priority Research Program of the Chinese Academy of Sciences under Grant No. XDC02040400.
Cite this paper
Ya, J., Liu, T., Guo, L. (2020). A Compare-Aggregate Model with External Knowledge for Query-Focused Summarization. In: Huang, Z., Beek, W., Wang, H., Zhou, R., Zhang, Y. (eds) Web Information Systems Engineering – WISE 2020. Lecture Notes in Computer Science, vol. 12343. Springer, Cham. https://doi.org/10.1007/978-3-030-62008-0_5