DOI: 10.1145/3477495.3531901

QSG Transformer: Transformer with Query-Attentive Semantic Graph for Query-Focused Summarization

Published: 07 July 2022

Abstract

Query-Focused Summarization (QFS) aims to extract the essential information from a long document and organize it into a summary that answers a given query. Transformer-based summarization models have recently been applied widely to QFS. However, the plain Transformer architecture cannot directly exploit the relationships between distant words or the information carried by the query. In this study, we propose the QSG Transformer, a novel QFS model that leverages structural information from a Query-attentive Semantic Graph (QSG) to address these issues. Specifically, the QSG Transformer improves QSG node representations with a proposed query-attentive graph attention network, which spreads the information of the query node over the QSG using Personalized PageRank; these representations are then used to generate a summary that better reflects the relationship between the query and the document. The proposed method is evaluated on two QFS datasets, on which it outperforms state-of-the-art models.
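The propagation step described in the abstract, spreading the query node's information over the graph via Personalized PageRank, can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the function name, the toy graph, and the teleport probability `alpha` are assumptions, and the real model propagates learned node representations rather than scalar scores.

```python
# Minimal sketch of Personalized PageRank propagation from a query node,
# illustrating how query information can be spread over a semantic graph.
# The graph, alpha, and iteration count are illustrative assumptions.
import numpy as np

def personalized_pagerank(adj, query_idx, alpha=0.15, iters=50):
    """Propagate importance from `query_idx` over the graph `adj`.

    adj: (n, n) adjacency matrix; alpha: restart (teleport) probability.
    Returns one score per node reflecting its proximity to the query node.
    """
    n = adj.shape[0]
    # Column-normalize adjacency into a transition matrix.
    deg = adj.sum(axis=0, keepdims=True)
    P = adj / np.where(deg == 0, 1, deg)
    # Restart distribution concentrated on the query node.
    restart = np.zeros(n)
    restart[query_idx] = 1.0
    scores = np.full(n, 1.0 / n)
    for _ in range(iters):
        # With prob. (1 - alpha) follow an edge; with prob. alpha
        # teleport back to the query node.
        scores = (1 - alpha) * (P @ scores) + alpha * restart
    return scores

# Toy graph: query node 0 linked to nodes 1 and 2; node 2 linked to 3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
scores = personalized_pagerank(A, query_idx=0)
assert scores[0] == scores.max()  # the query node retains the most mass
```

Because the restart always returns to the query node, nodes closer to it in the graph receive higher scores, which is the intuition behind biasing node representations toward the query.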




Published In

SIGIR '22: Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
July 2022
3569 pages
ISBN:9781450387323
DOI:10.1145/3477495
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 07 July 2022


Author Tags

  1. graph neural networks
  2. graph-based method
  3. query-focused summarization

Qualifiers

  • Short-paper

Funding Sources

  • National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT)
  • Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT)

Conference

SIGIR '22

Acceptance Rates

Overall acceptance rate: 792 of 3,983 submissions (20%)

Article Metrics

  • Downloads (last 12 months): 84
  • Downloads (last 6 weeks): 9
Reflects downloads up to 28 Feb 2025.

Cited By

  • (2024) Flexible and Adaptable Summarization via Expertise Separation. In Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2018-2027. https://doi.org/10.1145/3626772.3657789 (10 Jul 2024)
  • (2024) STRA: A Simple Token Replacement Strategy Alleviating Exposure Bias in Text Generation. In 2024 IEEE International Conference on Multimedia and Expo (ICME), 1-6. https://doi.org/10.1109/ICME57554.2024.10687864 (15 Jul 2024)
  • (2024) Local structure-aware graph contrastive representation learning. Neural Networks 172, 106083. https://doi.org/10.1016/j.neunet.2023.12.037 (Apr 2024)
  • (2024) Query-focused summarization with the context-graph information fusion transformer. Expert Systems with Applications 241, C. https://doi.org/10.1016/j.eswa.2023.122699 (25 Jun 2024)
  • (2023) A Lightweight Constrained Generation Alternative for Query-focused Summarization. In Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, 1745-1749. https://doi.org/10.1145/3539618.3591936 (19 Jul 2023)
