DOI: 10.1145/3660395.3660464
Research article

Research on Text Abstract Algorithm Based on Pre trained Large Model

Published: 01 June 2024

Abstract

As society enters the era of big data, text summarization has drawn growing attention from researchers; the goal is to generate short summaries that concisely and accurately convey the meaning of longer source texts. Motivated by the need for extractive text summarization, this article proposes an extractive summarization model built on a pre-training framework and augmented with contrastive learning. The pre-training framework allows the model to capture text semantics well, while contrastive learning addresses the exposure bias caused by the mismatch between the model's training and inference stages. Experiments show that the algorithm extracts text summaries effectively while improving extraction efficiency.
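The abstract names two ingredients but gives no implementation detail. As a minimal sketch of the first ingredient (using a pre-trained encoder to capture sentence semantics for extractive scoring), the snippet below embeds each sentence and the whole document with a Hugging Face BERT checkpoint and ranks sentences by cosine similarity to the document. The model name, mean pooling, and similarity-based scoring are illustrative assumptions, not the paper's actual method.

```python
# A minimal sketch (not the authors' code): score sentences for
# extraction by comparing pre-trained sentence embeddings against an
# embedding of the full document.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    """Encode texts with the pre-trained model and mean-pool the
    token states into one vector per text."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state   # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)      # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)       # (B, H)

doc = ["Big data has made long documents common.",
       "Summarization compresses them into short texts.",
       "Pre-trained encoders capture sentence semantics."]
sent_vecs = embed(doc)                    # one vector per candidate sentence
doc_vec = embed([" ".join(doc)])          # one vector for the whole document
scores = torch.nn.functional.cosine_similarity(sent_vecs, doc_vec)
print(scores)  # higher score = sentence is more representative of the document
```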
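For the second ingredient, contrastive learning against exposure bias, one common recipe (used, for example, by match-based extractive summarizers such as MatchSum) scores several candidate summaries and applies a margin ranking loss so that the model's score ordering follows the candidates' actual quality; because training compares the model's own candidates rather than only the gold summary, the train/inference mismatch shrinks. The loss below is a hedged sketch under that assumption; the margin value and the candidate ordering by ROUGE are placeholders, not details taken from the paper.

```python
# A hedged sketch of a contrastive (ranking) loss over candidate
# summaries; the exact loss used in the paper is not specified.
import torch

def contrastive_ranking_loss(scores, margin=0.01):
    """scores: (num_candidates,) model scores for candidate summaries,
    assumed pre-sorted so index 0 is the best candidate by ROUGE.
    Penalize any pair where a worse candidate outscores a better one
    by less than a rank-scaled margin."""
    loss = scores.new_zeros(())
    n = scores.size(0)
    for i in range(n):
        for j in range(i + 1, n):
            # better-ranked candidate i should beat worse candidate j
            gap = scores[i] - scores[j]
            loss = loss + torch.clamp(margin * (j - i) - gap, min=0)
    return loss

# Toy usage: four candidates; the model currently ranks them imperfectly.
scores = torch.tensor([0.8, 0.9, 0.5, 0.4], requires_grad=True)
loss = contrastive_ranking_loss(scores)
loss.backward()  # gradients push scores toward the quality ordering
print(loss.item())
```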



Publication Information

Published In: AIBDF '23: Proceedings of the 2023 3rd Guangdong-Hong Kong-Macao Greater Bay Area Artificial Intelligence and Big Data Forum, September 2023, 577 pages
ISBN: 9798400716362
DOI: 10.1145/3660395

Publisher: Association for Computing Machinery, New York, NY, United States


Conference: AIBDF 2023
