DOI: 10.1145/3573942.3573964

SUMSUG: Augmented Abstractive Text Summarization Model with Semantic Understanding Graphs

Published: 16 May 2023

Abstract

As a research hotspot in natural language processing, automatic text summarization has developed rapidly. A summary should be a generalization based on a deep understanding of the text. However, existing models do not adequately capture the semantic information of the text, so the generated summaries deviate from the source semantics and have low accuracy. This paper proposes SUMSUG, an Augmented Abstractive Text Summarization Model with Semantic Understanding Graphs. The model uses dual encoders, a text encoder and a graph encoder, to guide summary generation: it obtains contextual features from the text encoder and structural features from the graph encoder, and fuses them to capture richer semantic information. We evaluate the model on the Gigaword dataset. The experimental results show that the model outperforms comparable models, demonstrating its effectiveness.
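The dual-encoder fusion the abstract describes can be sketched roughly as follows. This is a minimal illustrative stand-in, not the paper's implementation: the mean-pooling text encoder, the one-hop neighbour-averaging graph encoder, and the element-wise fusion are all simplifying assumptions made here for concreteness.

```python
# Hypothetical sketch of the dual-encoder idea: a "text encoder" yields a
# context feature vector, a "graph encoder" yields a structure feature
# vector over a semantic graph, and the two are fused before decoding.
# All details below are assumed, not taken from the paper.

def text_encoder(token_embeddings):
    """Stand-in text encoder: mean-pool token embeddings into one
    context feature vector."""
    n, d = len(token_embeddings), len(token_embeddings[0])
    return [sum(tok[k] for tok in token_embeddings) / n for k in range(d)]

def graph_encoder(node_features, adjacency):
    """Stand-in graph encoder: one round of neighbour averaging over the
    semantic graph, then mean-pool the resulting node features."""
    n, d = len(node_features), len(node_features[0])
    pooled = [0.0] * d
    for i in range(n):
        nbrs = [j for j in range(n) if adjacency[i][j]]
        row = (node_features[i] if not nbrs
               else [sum(node_features[j][k] for j in nbrs) / len(nbrs)
                     for k in range(d)])
        for k in range(d):
            pooled[k] += row[k] / n
    return pooled

def fuse(context_vec, structure_vec):
    """Element-wise average as a stand-in for a learned fusion layer
    that would feed the summary decoder."""
    return [(c + s) / 2 for c, s in zip(context_vec, structure_vec)]

# Tiny worked example (2-dimensional features for readability).
tokens = [[1.0, 2.0], [3.0, 4.0]]              # two token embeddings
nodes = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # three semantic-graph nodes
adj = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]        # fully connected triangle

fused = fuse(text_encoder(tokens), graph_encoder(nodes, adj))
print(fused)
```

In a real model each encoder would be a trained network (e.g. an LSTM and a graph attention network, per the author tags) and the fusion would be a learned projection rather than an average; the sketch only shows how two feature streams can be combined into one representation.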



    Published In

    AIPR '22: Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition
    September 2022
    1221 pages
ISBN: 9781450396899
DOI: 10.1145/3573942

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. GAT
    2. Long short-term memory network
    3. Semantic Understanding Graphs
    4. Seq2Seq
    5. Text summarization

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • Key Research and Development Program of Shaanxi Province

    Conference

    AIPR 2022

