
Augmenting Neural Sentence Summarization Through Extractive Summarization

  • Conference paper
Natural Language Processing and Chinese Computing (NLPCC 2017)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10619)

Abstract

Neural sequence-to-sequence models have achieved great success on abstractive summarization tasks. However, because of the limit on input length, most previous work uses only a document's lead sentences as input when generating an abstractive summary, ignoring crucial information in the rest of the document. To alleviate this problem, we propose a novel approach that improves neural sentence summarization with extractive summarization, aiming to exploit as much of the document's information as possible. Furthermore, we present both a streamline strategy and a system combination strategy to fuse the content captured by these different views; both strategies can be easily adapted to other domains. Experimental results on the CNN/Daily Mail dataset demonstrate that both proposed strategies significantly improve the performance of neural sentence summarization.
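
Read plainly, the "streamline" strategy chains an extractive step in front of the abstractive model: salient sentences are selected first, so the shortened input fits the length limit of the sequence-to-sequence summarizer. Below is a minimal sketch of that idea, assuming a simple term-frequency centroid scorer as a stand-in for the extractive summarizer and a hypothetical abstractive_model callable in place of a trained seq2seq network; neither is the paper's actual implementation.

    from collections import Counter
    import math

    def centroid_scores(sentences):
        # Term-frequency vector per sentence, plus a document-level centroid.
        sent_tfs = [Counter(s.lower().split()) for s in sentences]
        centroid = Counter()
        for tf in sent_tfs:
            centroid.update(tf)

        def cosine(a, b):
            num = sum(a[w] * b[w] for w in a)
            den = math.sqrt(sum(v * v for v in a.values())) * \
                  math.sqrt(sum(v * v for v in b.values()))
            return num / den if den else 0.0

        # Salience of a sentence = similarity to the document centroid.
        return [cosine(tf, centroid) for tf in sent_tfs]

    def streamline_summarize(sentences, abstractive_model, k=3):
        # Step 1 (extractive): keep the k most salient sentences, in
        # original document order, so the abstractive input stays coherent.
        scores = centroid_scores(sentences)
        top = sorted(range(len(sentences)), key=lambda i: -scores[i])[:k]
        extract = " ".join(sentences[i] for i in sorted(top))
        # Step 2 (abstractive): the shortened input now fits the length
        # limit of a sequence-to-sequence summarizer.
        return abstractive_model(extract)

    # Dummy "model" that truncates its input, just to make the sketch runnable.
    doc = [
        "The storm hit the coast early on Monday.",
        "Thousands of homes lost power overnight.",
        "Officials said repairs could take several days.",
        "A local bakery stayed open despite the outage.",
    ]
    print(streamline_summarize(doc, lambda x: x[:60] + "...", k=2))

The second strategy, system combination, would instead fuse the outputs of several such pipelines rather than their inputs; its details are given in the body of the paper and are not reproduced here.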


Notes

  1. http://cs.nyu.edu/~kcho/DMQA/.

  2. https://stanfordnlp.github.io/CoreNLP/.

  3. https://github.com/facebook/NAMAS.

  4. https://github.com/nyu-dl/dl4mt-tutorial.

  5. https://github.com/pltrdy/files2rouge.


Acknowledgments

This research has been funded by the Natural Science Foundation of China under Grants No. 61673380, No. 61402478, and No. 61403379.

Author information

Corresponding author

Correspondence to Chengqing Zong.



Copyright information

© 2018 Springer International Publishing AG

About this paper


Cite this paper

Zhu, J., Zhou, L., Li, H., Zhang, J., Zhou, Y., Zong, C. (2018). Augmenting Neural Sentence Summarization Through Extractive Summarization. In: Huang, X., Jiang, J., Zhao, D., Feng, Y., Hong, Y. (eds) Natural Language Processing and Chinese Computing. NLPCC 2017. Lecture Notes in Computer Science, vol 10619. Springer, Cham. https://doi.org/10.1007/978-3-319-73618-1_2


  • DOI: https://doi.org/10.1007/978-3-319-73618-1_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-73617-4

  • Online ISBN: 978-3-319-73618-1

  • eBook Packages: Computer Science (R0)
