Improving Event Representation with Supervision from Available Semantic Resources

  • Conference paper
Database Systems for Advanced Applications (DASFAA 2023)

Abstract

Learning distributed representations of events is an indispensable but challenging task for event understanding. Existing studies address this problem either by composing the embeddings of event arguments and their attributes, or by exploiting various relations between events, such as co-occurrence and discourse relations. In this paper, we argue that the knowledge learned from sentence embeddings and word semantics can be leveraged to produce superior event embeddings. Specifically, we utilize natural language inference datasets for learning sentence embeddings and the knowledge base WordNet for word semantics. We propose a Multi-Level Supervised Contrastive Learning model (MLSCL) for learning event representations. Our model fuses these diverse semantic resources at the sentence, event, and word levels in an end-to-end manner. We conduct comprehensive experiments on three similarity tasks and one script prediction task. Experimental results show that MLSCL consistently achieves new state-of-the-art performance on all tasks, with higher training efficiency than the prior competitive model SWCC.
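The paper's full MLSCL objective combines supervision at the sentence, event, and word levels; as background only, the InfoNCE-style contrastive loss that this family of models builds on can be sketched in NumPy. The function name, shapes, and temperature below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.05):
    """Illustrative InfoNCE loss: each anchor's positive is the
    same-index row of `positives`; other rows act as in-batch negatives."""
    # L2-normalize so the dot product is cosine similarity
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sim = a @ p.T / temperature                      # (batch, batch) logits
    logits = sim - sim.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))               # NLL of the true positives

rng = np.random.default_rng(0)
events = rng.normal(size=(4, 8))                     # toy event embeddings
loss_aligned = info_nce(events, events)              # positives match anchors
loss_mismatch = info_nce(events, rng.normal(size=(4, 8)))  # random positives
print(loss_aligned, loss_mismatch)
```

When positives coincide with their anchors, the diagonal logits dominate the softmax and the loss approaches zero; with random positives the loss stays near the uniform baseline, which is the gradient signal contrastive training exploits.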

References

  1. Ding, X., Zhang, Y., Liu, T., Duan, J.: Deep learning for event-driven stock prediction. In: Twenty-Fourth International Joint Conference on Artificial Intelligence (2015)

  2. Modi, A.: Event embeddings for semantic script modeling. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 75–83 (2016)

  3. Ding, X., Zhang, Y., Liu, T., Duan, J.: Knowledge-driven event embedding for stock prediction. In: Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pp. 2133–2142 (2016)

  4. Ding, X., Liao, K., Liu, T., Li, Z., Duan, J.: Event representation learning enhanced with external commonsense knowledge. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 4894–4903 (2019)

  5. Weber, N., Balasubramanian, N., Chambers, N.: Event representations with tensor-based compositions. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)

  6. Lee, I.-T., Goldwasser, D.: FEEL: featured event embedding learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)

  7. Wang, Z., Zhang, Y., Chang, C.Y.: Integrating order information and event relation for script event prediction. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pp. 57–67 (2017)

  8. Lee, I.-T., Goldwasser, D.: Multi-relational script learning for discourse relations. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pp. 4214–4226 (2019)

  9. Gao, J., Wang, W., Yu, C., Zhao, H., Ng, W., Xu, R.: Improving event representation via simultaneous weakly supervised contrastive learning and clustering. arXiv preprint arXiv:2203.07633 (2022)

  10. Gao, T., Yao, X., Chen, D.: SimCSE: simple contrastive learning of sentence embeddings. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pp. 6894–6910 (2021)

  11. Lv, S., Zhu, F., Hu, S.: Integrating external event knowledge for script learning. In: Proceedings of the 28th International Conference on Computational Linguistics, pp. 306–315 (2020)

  12. Zhou, Y., Geng, X., Shen, T., Pei, J., Zhang, W., Jiang, D.: Modeling event-pair relations in external knowledge graphs for script reasoning. In: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, pp. 4586–4596 (2021)

  13. Li, Z., Ding, X., Liu, T.: Constructing narrative event evolutionary graph for script event prediction. arXiv preprint arXiv:1805.05081 (2018)

  14. Bai, L., Guan, S., Guo, J., Li, Z., Jin, X., Cheng, X.: Integrating deep event-level and script-level information for script event prediction. arXiv preprint arXiv:2110.15706 (2021)

  15. Zheng, J., Cai, F., Chen, H.: Incorporating scenario knowledge into a unified fine-tuning architecture for event representation. In: Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 249–258 (2020)

  16. Wang, L., et al.: Multi-level connection enhanced representation learning for script event prediction. In: Proceedings of the Web Conference 2021, pp. 3524–3533 (2021)

  17. Lv, S., Qian, W., Huang, L., Han, J., Hu, S.: SAM-Net: integrating event-level and chain-level attentions to predict what happens next. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 33, pp. 6802–6809 (2019)

  18. Zheng, J., Cai, F., Ling, Y., Chen, H.: Heterogeneous graph neural networks to predict what happen next. In: Proceedings of the 28th International Conference on Computational Linguistics, pp. 328–338 (2020)

  19. Zheng, J., Cai, F., Liu, J., Ling, Y., Chen, H.: Multistructure contrastive learning for pretraining event representation. IEEE Trans. Neural Netw. Learn. Syst. (2022)

  20. Miller, G.A.: WordNet: An Electronic Lexical Database. MIT Press, Cambridge (1998)

  21. Loureiro, D., Jorge, A.: Language modelling makes sense: propagating representations through WordNet for full-coverage word sense disambiguation. arXiv preprint arXiv:1906.10007 (2019)

  22. Chambers, N., Jurafsky, D.: Unsupervised learning of narrative event chains. In: Proceedings of ACL-08: HLT, pp. 789–797 (2008)

  23. Jans, B., Bethard, S., Vulic, I., Moens, M.-F.: Skip n-grams and ranking functions for predicting script events. In: Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2012), pp. 336–344. ACL, East Stroudsburg, PA (2012)

  24. Mikolov, T., Chen, K., Corrado, G., Dean, J.: Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781 (2013)

Author information

Corresponding author

Correspondence to Liangjun Zang.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Wei, S., Zang, L., Zhang, X., Song, X., Hu, S. (2023). Improving Event Representation with Supervision from Available Semantic Resources. In: Wang, X., et al. Database Systems for Advanced Applications. DASFAA 2023. Lecture Notes in Computer Science, vol 13945. Springer, Cham. https://doi.org/10.1007/978-3-031-30675-4_47

  • DOI: https://doi.org/10.1007/978-3-031-30675-4_47

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-30674-7

  • Online ISBN: 978-3-031-30675-4

  • eBook Packages: Computer Science (R0)
