Abstract
With the rapid advancement of technology and the massive volume of content surging over the Internet, distilling information efficiently has become an arduous task. Automatic text summarization offers a practical means of procuring such information quickly, compressing and refining a document into a summary. Abstractive text summarization, in particular, builds an internal semantic representation of the text and uses natural language generation techniques to create summaries closer to those written by humans. This paper uses a Long Short-Term Memory (LSTM)-based recurrent neural network to generate comprehensive abstractive summaries. Training an LSTM-based model requires a corpus containing a significant number of parallel article–summary pairs. For this purpose, we have used several news corpora, namely DUC 2003, DUC 2004 and the Gigaword corpus, after eliminating noise and other irrelevant data. Experiments and analyses are performed on a subset of these corpora and evaluated using the ROUGE metric. The experimental results verify the accuracy and validity of the proposed system.
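The abstract's summaries are scored with ROUGE, which in its simplest ROUGE-1 form measures unigram overlap between a candidate summary and a reference summary. The sketch below is a minimal illustration of that idea, not the official ROUGE package (Lin 2004): it ignores stemming, stopword removal, and multi-reference aggregation, and the example sentences are invented for illustration.

```python
# Minimal sketch of ROUGE-1 (unigram overlap) scoring.
# Simplified: lowercased whitespace tokenization only; the real ROUGE
# toolkit additionally supports stemming, stopwords, and n-gram variants.
from collections import Counter

def rouge_1(candidate: str, reference: str) -> dict:
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Clipped match count: each candidate unigram counts at most as
    # often as it appears in the reference (multiset intersection).
    overlap = sum((cand & ref).values())
    recall = overlap / max(sum(ref.values()), 1)
    precision = overlap / max(sum(cand.values()), 1)
    f1 = 2 * precision * recall / (precision + recall) if overlap else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

# Hypothetical candidate/reference pair for illustration.
scores = rouge_1("police killed the gunman",
                 "the gunman was shot by police")
print(scores)  # precision 0.75, recall 0.5, f1 0.6
```

Higher recall rewards covering the reference's content words; precision penalizes padding the candidate with extra words, and F1 balances the two.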
References
Baxendale, P.B.: Machine-made index for technical literature-an experiment. IBM J. Res. Dev. 2(4), 354–361 (1958)
Cho, K., et al.: Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078 (2014)
Clarke, J., Lapata, M.: Discourse constraints for document compression. Comput. Linguist. 36(3), 411–441 (2010)
Edmundson, H.P.: New methods in automatic extracting. J. ACM (JACM) 16(2), 264–285 (1969)
Gers, F.A., Schmidhuber, J., Cummins, F.: Learning to forget: continual prediction with LSTM. Neural Comput. 12(10), 2451–2471 (2000)
Klein, G., Kim, Y., Deng, Y., Senellart, J., Rush, A.M.: OpenNMT: open-source toolkit for neural machine translation. arXiv preprint arXiv:1701.02810 (2017)
Lin, C.Y.: Improving summarization performance by sentence compression-a pilot study. In: Proceedings of the Sixth International Workshop on Information Retrieval with Asian Languages, pp. 1–8 (2003)
Lin, C.Y.: Rouge: a package for automatic evaluation of summaries. In: Text Summarization Branches Out, pp. 74–81 (2004)
Luhn, H.P.: The automatic creation of literature abstracts. IBM J. Res. Dev. 2(2), 159–165 (1958)
Luong, M.T., Pham, H., Manning, C.D.: Effective approaches to attention-based neural machine translation. arXiv preprint arXiv:1508.04025 (2015)
Rush, A.M., Chopra, S., Weston, J.: A neural attention model for abstractive sentence summarization. arXiv preprint arXiv:1509.00685 (2015)
Shardan, R., Kulkarni, U.: Implementation and evaluation of evolutionary connectionist approaches to automated text summarization (2010)
Sutskever, I., Vinyals, O., Le, Q.V.: Sequence to sequence learning with neural networks. In: Advances in Neural Information Processing Systems, pp. 3104–3112. Curran Associates, Inc. (2014)
Wang, S., Zhao, X., Li, B., Ge, B., Tang, D.: Integrating extractive and abstractive models for long text summarization. In: 2017 IEEE International Congress on Big Data (BigData Congress), pp. 305–312. IEEE (2017)
Wu, Y., et al.: Google’s neural machine translation system: bridging the gap between human and machine translation. arXiv preprint arXiv:1609.08144 (2016)
Copyright information
© 2023 Springer Nature Switzerland AG
Cite this paper
Debnath, D., Pakray, P., Das, R., Gelbukh, A. (2023). An Abstractive Text Summarization Using Recurrent Neural Network. In: Gelbukh, A. (eds) Computational Linguistics and Intelligent Text Processing. CICLing 2018. Lecture Notes in Computer Science, vol 13397. Springer, Cham. https://doi.org/10.1007/978-3-031-23804-8_29
Print ISBN: 978-3-031-23803-1
Online ISBN: 978-3-031-23804-8
eBook Packages: Computer Science (R0)