
An Abstractive Summarization Method Based on Global Gated Dual Encoder

  • Conference paper
  • First Online:
Natural Language Processing and Chinese Computing (NLPCC 2020)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 12431)

Abstract

The sequence-to-sequence model with RNN-based attention has been applied successfully to abstractive summarization, but existing models generally fail to capture long-range information because of the inherent limitations of RNNs. This paper therefore proposes an abstractive text summarization method based on a global gated dual encoder (GDE). A Transformer encoder is combined with the RNN encoder to extract global semantics, and a global gating unit built on the dual encoder is designed to filter key information, suppress redundancy, and dynamically compensate for insufficient semantics. Extensive experiments on the LCSTS Chinese dataset and the CNN/Daily Mail English dataset show that the proposed model outperforms current state-of-the-art abstractive methods.

This work is supported by the National Key Research and Development Program of China under Grant No. 2018YFC0832102 and the National Natural Science Foundation of China under Grant No. 61936001.
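The abstract describes a dual encoder in which an RNN and a Transformer encode the source in parallel and a global gating unit fuses the two representations. The sketch below illustrates that idea in PyTorch; the GRU-based local encoder, the sigmoid gate over concatenated states, and all module names and dimensions are illustrative assumptions, not the exact formulation in the paper.

```python
# Illustrative sketch only: a dual encoder (GRU + Transformer) with a global
# gating unit that fuses the two encodings. Hyperparameters and fusion details
# are assumptions, not the formulation from the paper.
import torch
import torch.nn as nn


class GlobalGatedDualEncoder(nn.Module):
    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Local/sequential encoder: a bidirectional GRU (output size = d_model).
        self.rnn = nn.GRU(d_model, d_model // 2, batch_first=True,
                          bidirectional=True)
        # Global encoder: a Transformer capturing long-range dependencies.
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers)
        # Global gating unit: a per-position sigmoid gate over the concatenated
        # encodings decides how much of each encoder's representation to keep.
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, token_ids):
        x = self.embed(token_ids)              # (batch, seq, d_model)
        local, _ = self.rnn(x)                 # (batch, seq, d_model)
        global_ctx = self.transformer(x)       # (batch, seq, d_model)
        g = torch.sigmoid(self.gate(torch.cat([local, global_ctx], dim=-1)))
        # Gated fusion: keep key information, suppress redundant content.
        return g * local + (1.0 - g) * global_ctx


if __name__ == "__main__":
    enc = GlobalGatedDualEncoder(vocab_size=5000)
    out = enc(torch.randint(0, 5000, (2, 20)))
    print(out.shape)  # torch.Size([2, 20, 256])
```

The gated sum is one simple way to realize the "filter key information / prevent redundancy" behavior the abstract mentions; the paper's actual gating and decoding details should be taken from the full text.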



Author information


Corresponding author

Correspondence to Qun Liu.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Peng, L., Liu, Q., Lv, L., Deng, W., Wang, C. (2020). An Abstractive Summarization Method Based on Global Gated Dual Encoder. In: Zhu, X., Zhang, M., Hong, Y., He, R. (eds.) Natural Language Processing and Chinese Computing. NLPCC 2020. Lecture Notes in Computer Science, vol 12431. Springer, Cham. https://doi.org/10.1007/978-3-030-60457-8_29


  • DOI: https://doi.org/10.1007/978-3-030-60457-8_29

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-60456-1

  • Online ISBN: 978-3-030-60457-8

  • eBook Packages: Computer Science, Computer Science (R0)
