An Abstractive Text Summarization Using Recurrent Neural Network

  • Conference paper
Computational Linguistics and Intelligent Text Processing (CICLing 2018)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13397)

Abstract

With the rapid advancement of technology and the massive volume of content surging over the Internet, abstracting information efficiently has become an arduous task. Automatic text summarization, however, provides an effective means of procuring such information quickly, in the form of a summary produced through compression and refinement. Abstractive text summarization, in particular, builds an internal semantic representation of the text and uses natural language generation techniques to create summaries closer to human-written ones. This paper uses a Long Short-Term Memory (LSTM) based Recurrent Neural Network to generate comprehensive abstractive summaries. Training an LSTM-based model requires a corpus containing a significant number of parallel article-summary pairs. For this purpose, we have used several news corpora, namely DUC 2003, DUC 2004, and Gigaword, after eliminating noise and other irrelevant data. The experiments and analyses in this work are performed on a subset of these corpora and evaluated using the ROUGE metrics. The experimental results verify the accuracy and validity of the proposed system.
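
The architecture the abstract describes is a standard sequence-to-sequence model: an LSTM encoder reads the article and an LSTM decoder generates the summary token by token, trained with teacher forcing on the parallel article-summary pairs. The sketch below illustrates that setup in PyTorch. It is a minimal illustration, not the authors' implementation (footnote 1 points to the OpenNMT toolkit); all class names, vocabulary sizes, and dimensions are assumptions made for the example.

```python
# Minimal sketch of an LSTM encoder-decoder summarizer of the kind the
# abstract describes. Illustrative only: the class name, vocabulary size,
# and dimensions are assumptions, not the authors' configuration.
import torch
import torch.nn as nn

class Seq2SeqSummarizer(nn.Module):
    def __init__(self, vocab_size=50000, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder reads the article; decoder generates the summary.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, article_ids, summary_ids):
        # Encode the article into final hidden/cell states.
        _, state = self.encoder(self.embed(article_ids))
        # Teacher forcing: decode conditioned on the encoder state.
        dec_out, _ = self.decoder(self.embed(summary_ids), state)
        return self.out(dec_out)  # logits over the summary vocabulary

model = Seq2SeqSummarizer()
article = torch.randint(0, 50000, (8, 400))  # batch of token-id articles
summary = torch.randint(0, 50000, (8, 60))   # parallel reference summaries
logits = model(article, summary[:, :-1])     # predict each next token
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 50000),
                             summary[:, 1:].reshape(-1))
loss.backward()  # one training step over article-summary pairs
```

In practice, toolkits such as OpenNMT wrap this core with attention, beam-search decoding, and data pipelines; the sketch keeps only the encoder-decoder skeleton.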

Notes

  1. http://opennmt.net.

  2. http://duc.nist.gov/.

  3. https://catalog.ldc.upenn.edu/LDC2002T31.
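
Footnotes 2 and 3 locate the DUC and Gigaword data on which the abstract reports ROUGE evaluation. ROUGE-N scores a generated summary by its n-gram overlap with a reference summary; the sketch below computes ROUGE-N recall from scratch as an illustration only. Official evaluations use the full ROUGE package, which adds stemming and support for multiple references.

```python
# Minimal sketch of ROUGE-N recall: the fraction of reference n-grams
# that also appear in the candidate summary. Illustrative only; the
# official ROUGE toolkit adds stemming and multi-reference handling.
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_recall(candidate, reference, n=1):
    cand = ngrams(candidate.split(), n)
    ref = ngrams(reference.split(), n)
    if not ref:
        return 0.0
    # Clipped overlap: each reference n-gram counts at most as often
    # as it occurs in the candidate.
    overlap = sum(min(count, cand[gram]) for gram, count in ref.items())
    return overlap / sum(ref.values())

reference = "the cat sat on the mat"
candidate = "the cat was on the mat"
print(rouge_n_recall(candidate, reference, n=1))  # 5/6 ~ 0.833 (ROUGE-1)
print(rouge_n_recall(candidate, reference, n=2))  # 3/5 = 0.6   (ROUGE-2)
```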

Author information

Correspondence to Dipanwita Debnath.

Copyright information

© 2023 Springer Nature Switzerland AG

About this paper

Cite this paper

Debnath, D., Pakray, P., Das, R., Gelbukh, A. (2023). An Abstractive Text Summarization Using Recurrent Neural Network. In: Gelbukh, A. (ed.) Computational Linguistics and Intelligent Text Processing. CICLing 2018. Lecture Notes in Computer Science, vol 13397. Springer, Cham. https://doi.org/10.1007/978-3-031-23804-8_29

  • DOI: https://doi.org/10.1007/978-3-031-23804-8_29

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-23803-1

  • Online ISBN: 978-3-031-23804-8

  • eBook Packages: Computer Science, Computer Science (R0)
