
Improving Event Representation for Script Event Prediction via Data Augmentation and Integration

  • Conference paper
Natural Language Processing and Chinese Computing (NLPCC 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14303)


Abstract

Script event prediction aims to predict the most likely subsequent event given the historical events in a script. The task requires learning rich semantic and relational information between events. Most previous methods focus mainly on local contextual information while neglecting the deeper semantic features of events. In this work, we propose a novel framework, called ECer, which obtains more comprehensive event information through data augmentation and information integration. We first employ rectified linear attention to connect the initial event representations at the argument level. Then, to learn richer semantic information, data augmentation is applied to expand the data and introduce external knowledge. The initial representations and the features of the augmented data are then mixed via Mixup. Finally, an attention module is applied to the context event chain to dynamically integrate context events with respect to the current candidate event. Experimental results on the widely used New York Times corpus demonstrate the effectiveness and superiority of the proposed model.
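The abstract describes three computational steps: argument-level fusion with rectified linear attention, Mixup of original and augmented event features, and attention-based integration of the context chain with respect to each candidate event. The following PyTorch sketch is a minimal, hypothetical illustration of these steps only; the module names, dimensions, scoring head, and the ReLU-scored reading of rectified linear attention are assumptions for exposition, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class RectifiedLinearAttention(nn.Module):
    """Self-attention over an event's arguments with ReLU-rectified scores
    in place of softmax (one common reading of 'rectified linear attention';
    the paper's exact formulation may differ)."""

    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x):                       # x: (batch, n_args, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = torch.relu(q @ k.transpose(-2, -1)) / x.size(-1) ** 0.5
        return scores @ v                       # fused argument features


def mixup(h, h_aug, alpha=0.4):
    """Mix original and augmented event features with a Beta-sampled weight."""
    lam = torch.distributions.Beta(alpha, alpha).sample()
    return lam * h + (1.0 - lam) * h_aug


class CandidateScorer(nn.Module):
    """Integrate the context event chain w.r.t. a candidate event and score it
    (a simple dot-product attention pooling plus a linear head, as a stand-in
    for the paper's integration module)."""

    def __init__(self, dim):
        super().__init__()
        self.arg_fusion = RectifiedLinearAttention(dim)
        self.score = nn.Linear(2 * dim, 1)

    def forward(self, context, candidate):
        # context: (batch, n_events, n_args, dim); candidate: (batch, n_args, dim)
        b, e = context.size(0), context.size(1)
        ctx = self.arg_fusion(context.flatten(0, 1)).mean(dim=1).view(b, e, -1)
        cand = self.arg_fusion(candidate).mean(dim=1)                # (batch, dim)
        attn = F.softmax((ctx * cand.unsqueeze(1)).sum(-1), dim=-1)  # (batch, n_events)
        pooled = (attn.unsqueeze(-1) * ctx).sum(dim=1)               # (batch, dim)
        return self.score(torch.cat([pooled, cand], dim=-1)).squeeze(-1)


# Toy usage: 2 chains of 8 context events, 3 arguments per event, 4 candidates each.
dim = 128
model = CandidateScorer(dim)
context = torch.randn(2, 8, 3, dim)
candidates = torch.randn(2, 4, 3, dim)
# Stand-in for features of genuinely augmented events (paraphrases, external knowledge).
candidates = mixup(candidates, candidates + 0.01 * torch.randn_like(candidates))
scores = torch.stack([model(context, candidates[:, i]) for i in range(4)], dim=-1)
print(scores.shape)  # torch.Size([2, 4]): pick the highest-scoring candidate per chain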



Acknowledgements

The authors would like to thank the anonymous reviewers for their helpful comments. This work was supported by the National University of Defense Technology Research Project (ZK20-46), the Young Elite Scientists Sponsorship Program (2021-JCJQ-QT-050), the National Natural Science Foundation of China under Grants 61972207, U1836208, U1836110, and 61672290, and a project funded by the China Postdoctoral Science Foundation (2021MD703983).

Author information


Corresponding author

Correspondence to Kun Ding.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Liu, Y. et al. (2023). Improving Event Representation for Script Event Prediction via Data Augmentation and Integration. In: Liu, F., Duan, N., Xu, Q., Hong, Y. (eds) Natural Language Processing and Chinese Computing. NLPCC 2023. Lecture Notes in Computer Science (LNAI), vol. 14303. Springer, Cham. https://doi.org/10.1007/978-3-031-44696-2_52


  • DOI: https://doi.org/10.1007/978-3-031-44696-2_52

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-44695-5

  • Online ISBN: 978-3-031-44696-2

  • eBook Packages: Computer Science, Computer Science (R0)
