MSK-Net: Multi-source Knowledge Base Enhanced Networks for Script Event Prediction

  • Conference paper
Neural Information Processing (ICONIP 2022)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1794)

Abstract

Script event prediction (SEP) aims to choose the correct subsequent event from a candidate list, given a chain of ordered context events. Such event reasoning is easy for humans but difficult for machines, because humans possess the relevant commonsense knowledge. If we supplement this knowledge from external knowledge bases, machines may be able to improve their reasoning ability. To this end, we introduce a novel approach, named MSK-Net, which consists of a Question Encoder, a Knowledge Searcher, a Knowledge Encoder and a Result Predictor. To the best of our knowledge, this is the first model to utilize multi-source knowledge for the SEP problem. Specifically, we first use the Question Encoder to encode the question, including the candidate event to be judged and the context events, focusing on intra-event contextualization and inter-event order information modelling. Second, we use the Knowledge Searcher to retrieve relevant knowledge from multi-source knowledge bases (such as ASER and ATOMIC). Third, the Knowledge Encoder encodes the knowledge retrieved in the second step. Last, the Result Predictor gives the final prediction. Experiments on the widely used multiple choice narrative cloze (MCNC) task demonstrate that our approach achieves state-of-the-art performance compared to other methods. It is also worth noting that MSK-Net without external knowledge remains very competitive.
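To make the four-stage pipeline concrete, the sketch below wires the named modules together in PyTorch. Only the module names and the overall flow come from the abstract; everything else (a small Transformer as the Question Encoder, a GRU as the Knowledge Encoder, an MLP scorer as the Result Predictor, the hidden size, and treating the Knowledge Searcher as a retrieval step performed outside the network) is an illustrative assumption, not the authors' actual design.

```python
# Illustrative sketch of the MSK-Net pipeline described in the abstract.
# Module internals and dimensions are assumptions made for this example.
import torch
import torch.nn as nn


class MSKNetSketch(nn.Module):
    def __init__(self, hidden: int = 128):
        super().__init__()
        # Question Encoder: contextualizes the context events plus one candidate event.
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.question_encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Knowledge Encoder: encodes snippets retrieved from ASER / ATOMIC.
        # (The Knowledge Searcher itself is a retrieval step outside this module.)
        self.knowledge_encoder = nn.GRU(hidden, hidden, batch_first=True)
        # Result Predictor: scores each candidate event.
        self.predictor = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, question_emb: torch.Tensor, knowledge_emb: torch.Tensor) -> torch.Tensor:
        # question_emb:  (candidates, events, hidden)   context chain + candidate event
        # knowledge_emb: (candidates, snippets, hidden) retrieved knowledge for each candidate
        q = self.question_encoder(question_emb).mean(dim=1)   # pooled question vector
        _, k = self.knowledge_encoder(knowledge_emb)           # final GRU hidden state
        return self.predictor(torch.cat([q, k.squeeze(0)], dim=-1))  # one score per candidate


# Toy usage: 5 candidates, each paired with 8 context events + itself (9 events)
# and 6 retrieved knowledge snippets; the highest-scoring candidate is chosen.
model = MSKNetSketch()
scores = model(torch.randn(5, 9, 128), torch.randn(5, 6, 128))
print(scores.argmax().item())
```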


Notes

  1. https://hkust-knowcomp.github.io/ASER/html/index.html (we use the core version of ASER 1.0).

  2. https://homes.cs.washington.edu/~msap/atomic/ (we use the aggregated data v4_atomic_all_agg.csv); see the loading sketch after these notes.

  3. There are 15 relations in the ASER data and 5 relation types selected from the ATOMIC data.
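As a minimal sketch, assuming pandas and the column layout of the public ATOMIC v4 release, the snippet below loads the aggregated file named in note 2 and parses a subset of relation columns. The notes do not say which five relation types the paper selects, so RELATIONS is a hypothetical placeholder.

```python
# Minimal sketch: load the aggregated ATOMIC file from note 2.
# Assumptions: pandas is installed, the file uses the public ATOMIC v4 column
# names, and RELATIONS is a hypothetical stand-in for the paper's selection.
import ast

import pandas as pd

RELATIONS = ["xIntent", "xNeed", "xWant", "xEffect", "xReact"]  # hypothetical selection

atomic = pd.read_csv("v4_atomic_all_agg.csv")
for col in RELATIONS:
    # Each cell stores a list of tail phrases serialized as a Python-style string.
    atomic[col] = atomic[col].apply(
        lambda s: ast.literal_eval(s) if isinstance(s, str) else []
    )

print(atomic[["event"] + RELATIONS].head())
```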

References

  1. Chambers, N., Jurafsky, D.: Unsupervised learning of narrative event chains. In: ACL 2008, pp. 789–797. The Association for Computer Linguistics (2008)

  2. Devlin, J., Chang, M., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: NAACL-HLT 2019, pp. 4171–4186. Association for Computational Linguistics (2019)

  3. Ding, X., Liao, K., Liu, T., Li, Z., Duan, J.: Event representation learning enhanced with external commonsense knowledge. In: EMNLP-IJCNLP 2019, pp. 4893–4902. Association for Computational Linguistics (2019)

  4. Granroth-Wilding, M., Clark, S.: What happens next? Event prediction using a compositional neural network model. In: AAAI 2016, pp. 2727–2733. AAAI Press (2016)

  5. Jans, B., Bethard, S., Vulic, I., Moens, M.: Skip N-grams and ranking functions for predicting script events. In: EACL 2012, pp. 336–344. The Association for Computer Linguistics (2012)

  6. Lan, Z., Chen, M., Goodman, S., Gimpel, K., Sharma, P., Soricut, R.: ALBERT: a lite BERT for self-supervised learning of language representations. In: ICLR 2020 (2020)

  7. Lee, I., Goldwasser, D.: FEEL: featured event embedding learning. In: AAAI 2018, pp. 4840–4847. AAAI Press (2018)

  8. Li, Z., Ding, X., Liu, T.: Constructing narrative event evolutionary graph for script event prediction. In: IJCAI 2018, pp. 4201–4207. ijcai.org (2018)

  9. Liu, Y., et al.: RoBERTa: a robustly optimized BERT pretraining approach. CoRR abs/1907.11692 (2019)

  10. Lv, S., Qian, W., Huang, L., Han, J., Hu, S.: SAM-Net: integrating event-level and chain-level attentions to predict what happens next. In: AAAI 2019, pp. 6802–6809. AAAI Press (2019)

  11. Lv, S., Zhu, F., Hu, S.: Integrating external event knowledge for script learning. In: COLING 2020, pp. 306–315. International Committee on Computational Linguistics (2020)

  12. Pichotta, K., Mooney, R.J.: Statistical script learning with multi-argument events. In: EACL 2014, pp. 220–229. The Association for Computer Linguistics (2014)

  13. Pichotta, K., Mooney, R.J.: Learning statistical scripts with LSTM recurrent neural networks. In: AAAI 2016, pp. 2800–2806. AAAI Press (2016)

  14. Rashkin, H., Sap, M., Allaway, E., Smith, N.A., Choi, Y.: Event2Mind: Commonsense inference on events, intents, and reactions. In: ACL 2018, pp. 463–473. Association for Computational Linguistics (2018)

  15. Sap, M., et al.: ATOMIC: an atlas of machine commonsense for if-then reasoning. In: AAAI 2019, pp. 3027–3035. AAAI Press (2019)

  16. Schank, R.C., Abelson, R.P.: Scripts, plans, goals and understanding: an inquiry into human knowledge structures. Technical report (1977)

  17. Wang, L., et al.: Multi-level connection enhanced representation learning for script event prediction. In: WWW 2021, pp. 3524–3533. ACM/IW3C2 (2021)

  18. Wang, Z., Zhang, Y., Chang, C.: Integrating order information and event relation for script event prediction. In: EMNLP 2017, pp. 57–67. Association for Computational Linguistics (2017)

  19. Zhang, H., Liu, X., Pan, H., Song, Y., Leung, C.W.: ASER: a large-scale eventuality knowledge graph. In: WWW 2020, pp. 201–211. ACM/IW3C2 (2020)

  20. Zhang, L., Zhou, D., He, Y., Yang, Z.: MERL: multimodal event representation learning in heterogeneous embedding spaces. In: AAAI 2021, pp. 14420–14427. AAAI Press (2021)

Acknowledgements

We would like to thank Jingqi Suo for supporting our script-learning research, and the anonymous reviewers for their valuable comments and suggestions, which helped improve the quality of this paper.

Author information

Corresponding author

Correspondence to Daren Zha.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Yang, S., Zha, D., Xue, C. (2023). MSK-Net: Multi-source Knowledge Base Enhanced Networks for Script Event Prediction. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Communications in Computer and Information Science, vol 1794. Springer, Singapore. https://doi.org/10.1007/978-981-99-1648-1_6

  • DOI: https://doi.org/10.1007/978-981-99-1648-1_6

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-1647-4

  • Online ISBN: 978-981-99-1648-1

  • eBook Packages: Computer Science, Computer Science (R0)
