
Dual Encoding for Abstractive Text Summarization


Abstract:

Recurrent neural network-based sequence-to-sequence attentional models have proven effective in abstractive text summarization. In this paper, we model abstractive text summarization using a dual encoding model. Unlike previous works that use only a single encoder, the proposed method employs a dual encoder consisting of a primary and a secondary encoder. Specifically, the primary encoder conducts coarse encoding in the regular way, while the secondary encoder models the importance of words and generates a finer encoding based on the raw input text and the previously generated output summary. The two-level encodings are combined and fed into the decoder to generate a more diverse summary, which reduces repetition in long-sequence generation. The experimental results on two challenging datasets (i.e., CNN/DailyMail and DUC 2004) demonstrate that our dual encoding model performs favorably against existing methods.
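The two-stage encoding described above can be illustrated with a minimal NumPy sketch. All names, dimensions, and weight shapes here are hypothetical stand-ins, not the paper's actual architecture: a plain tanh-RNN plays the primary (coarse) encoder, and the secondary (fine) encoder re-weights the primary states by an importance score that also conditions on the decoder state, standing in for the previously generated summary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper's actual sizes differ).
VOCAB, EMB, HID = 20, 8, 6

E = rng.normal(scale=0.1, size=(VOCAB, EMB))        # embedding table
Wp = rng.normal(scale=0.1, size=(EMB + HID, HID))   # primary RNN weights
Ws = rng.normal(scale=0.1, size=(EMB + HID, 1))     # secondary importance scorer

def primary_encode(tokens):
    """Coarse encoding: a plain tanh-RNN pass over the input tokens."""
    h = np.zeros(HID)
    states = []
    for t in tokens:
        h = np.tanh(np.concatenate([E[t], h]) @ Wp)
        states.append(h)
    return np.stack(states)                           # (len, HID)

def secondary_encode(tokens, states, dec_state):
    """Fine encoding: re-weight the input by an importance score that
    conditions on the decoder state (a stand-in for the previously
    generated summary text)."""
    scores = np.array([
        (np.concatenate([E[t], dec_state]) @ Ws).item()
        for t in tokens
    ])
    w = np.exp(scores) / np.sum(np.exp(scores))       # softmax importance
    return (w[:, None] * states).sum(axis=0)          # (HID,)

tokens = [3, 7, 7, 1, 12]
states = primary_encode(tokens)
dec_state = np.zeros(HID)                             # decoder state at step 0
fine = secondary_encode(tokens, states, dec_state)
combined = np.concatenate([states[-1], fine])         # fed to the decoder
print(combined.shape)                                 # (12,)
```

In this sketch the coarse and fine encodings are simply concatenated before decoding; the importance weights shift as the decoder state evolves, which is the mechanism the abstract credits with reducing repetition.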
Published in: IEEE Transactions on Cybernetics ( Volume: 50, Issue: 3, March 2020)
Page(s): 985 - 996
Date of Publication: 02 November 2018

PubMed ID: 30403646

