Abstract
Abstractive text summarization is an important task in natural language generation: it aims to compress an input document into a concise and informative summary. Sequence-to-Sequence (Seq2Seq) models have achieved strong results on abstractive summarization in recent years. However, such models are often sensitive to noise in the training data and prove fragile in practical applications. To strengthen their denoising ability, we propose Multi-Granularity Contrastive Siamese Networks for abstractive text summarization. Specifically, we first apply word-level and sentence-level data augmentation to the input text, injecting noise at both granularities to generate augmented text pairs with diverse noise. We then jointly train the Seq2Seq model with a contrastive objective that maximizes the consistency between the representations of each augmented pair through a Siamese network. We conduct empirical experiments on the CNN/Daily Mail and XSum datasets, and comparisons against many existing baselines validate the effectiveness of our model.
Project supported by the National Natural Science Foundation of China (62176145).
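To make the two-step recipe in the abstract concrete, below is a minimal PyTorch sketch of one way it could be realized. The specific augmentation operations (random word deletion and sentence shuffling), the InfoNCE-style loss, and every function name here are illustrative assumptions; the abstract does not specify the paper's exact operations.

```python
# Sketch only: augmentation choices, loss form, and names are assumptions,
# not the paper's confirmed implementation.
import random
import torch
import torch.nn.functional as F

def word_level_augment(tokens, p=0.1):
    """Word-level noise: randomly drop each token with probability p."""
    kept = [t for t in tokens if random.random() > p]
    return kept if kept else tokens  # never return an empty sequence

def sentence_level_augment(sentences):
    """Sentence-level noise: shuffle the order of sentences."""
    shuffled = list(sentences)
    random.shuffle(shuffled)
    return shuffled

def make_view(sentences, p=0.1):
    """Compose both granularities of noise to build one augmented view."""
    noisy = sentence_level_augment(sentences)
    return [word_level_augment(sent, p) for sent in noisy]

def contrastive_loss(z1, z2, temperature=0.1):
    """InfoNCE-style loss: pull the two views of each document together,
    push apart the other documents in the batch (in-batch negatives)."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature            # (B, B) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

# Joint training (sketch): the usual summarization cross-entropy plus a
# weighted contrastive term over encoder representations of the two views,
# where `encoder` is the shared (Siamese) Seq2Seq encoder:
#   z1, z2 = encoder(make_view(doc)), encoder(make_view(doc))
#   loss = nll_loss + lambda_cl * contrastive_loss(z1, z2)
```

Sharing one encoder across both views is what makes the network Siamese; the contrastive term then rewards representations that are stable under both word-level and sentence-level noise.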