Cross-Sentence Temporal Relation Extraction with Relative Sentence Time

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13368)

Abstract

Event temporal relations capture how events evolve and play an important role in natural language processing. Many recent studies that employ pre-trained language models have shown prominent performance improvements. However, because the context is more complex, these approaches usually perform poorly when the two events are not within the same sentence.

In this paper, we therefore propose a cross-sentence temporal relation extraction model that incorporates the prediction of temporal relations between sentences to enhance event temporal relation extraction. A multi-task learning framework is adopted, integrating the temporal relation classifier with an auxiliary task that predicts the temporal order of the sentences. In addition, to deal with class-imbalanced data, we propose a sub-sampling method that decreases the number of Vague relations. Extensive experiments show that, compared to the baseline model, our model improves cross-sentence temporal relation extraction and achieves state-of-the-art results on the TimeBank-Dense, MATRES, and TCR datasets.
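
The paper's full architecture is not reproduced here, but the multi-task idea described above can be sketched as a shared pre-trained encoder with two classification heads: one for the event-pair temporal relation and one for the auxiliary sentence-order task. The following minimal PyTorch sketch assumes a BERT encoder; the label sets, event-token pooling, head sizes, and loss weighting are illustrative assumptions, not the authors' exact configuration.

    import torch
    import torch.nn as nn
    from transformers import BertModel

    class MultiTaskTemporalModel(nn.Module):
        # Shared encoder with two heads: the main event-pair relation classifier
        # and an auxiliary classifier for the temporal order of the two sentences.
        def __init__(self, num_rel_labels=4, num_order_labels=3, aux_weight=0.5):
            super().__init__()
            self.encoder = BertModel.from_pretrained("bert-base-uncased")
            hidden = self.encoder.config.hidden_size
            self.rel_head = nn.Linear(2 * hidden, num_rel_labels)      # e.g. Before/After/Equal/Vague
            self.order_head = nn.Linear(2 * hidden, num_order_labels)  # sentence order (assumed 3-way)
            self.aux_weight = aux_weight                               # weight of the auxiliary loss
            self.loss_fn = nn.CrossEntropyLoss()

        def forward(self, input_ids, attention_mask, e1_idx, e2_idx,
                    rel_label=None, order_label=None):
            # Encode the (possibly multi-sentence) context once and reuse it for both tasks.
            hidden_states = self.encoder(input_ids=input_ids,
                                         attention_mask=attention_mask).last_hidden_state
            batch = torch.arange(hidden_states.size(0), device=hidden_states.device)
            # Represent the pair by concatenating the two event-token vectors.
            pair = torch.cat([hidden_states[batch, e1_idx],
                              hidden_states[batch, e2_idx]], dim=-1)
            rel_logits = self.rel_head(pair)
            order_logits = self.order_head(pair)
            if rel_label is None:
                return rel_logits, order_logits
            # Joint training objective: relation loss plus down-weighted sentence-order loss.
            loss = (self.loss_fn(rel_logits, rel_label)
                    + self.aux_weight * self.loss_fn(order_logits, order_label))
            return loss, rel_logits, order_logits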
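
Similarly, the sub-sampling step for class-imbalanced data can be illustrated as randomly dropping a portion of Vague-labelled pairs before training. The keep ratio, example format, and label string below are assumptions for illustration, not values taken from the paper.

    import random

    def subsample_vague(examples, keep_ratio=0.5, vague_label="Vague", seed=13):
        # Randomly keep only `keep_ratio` of the Vague-labelled pairs so the
        # relation classifier is not dominated by the majority Vague class.
        rng = random.Random(seed)
        kept = []
        for example in examples:
            if example["label"] == vague_label and rng.random() > keep_ratio:
                continue  # drop this Vague example
            kept.append(example)
        return kept

    # Example usage on a toy list of labelled event pairs.
    data = [{"pair": ("said", "met"), "label": "Vague"},
            {"pair": ("left", "arrived"), "label": "Before"}]
    train_data = subsample_vague(data, keep_ratio=0.3)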

Notes

  1. The result on TimeBank-Dense was reproduced by ourselves using the same method, since the cited paper did not report experiments on this dataset.

References

  1. Ning, Q., Wu, H., Han, R., Peng, N., Gardner, M., Roth, D.: TORQUE: a reading comprehension dataset of temporal ordering questions. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, pp. 1158–1172 (2020)

  2. Leeuwenberg, A., Moens, M.F.: Temporal information extraction by predicting relative time-lines. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pp. 1237–1246 (2018)

  3. Ning, Q., Feng, Z., Roth, D.: A structured learning approach to temporal relation extraction. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pp. 1027–1037. Association for Computational Linguistics (2017)

  4. Cassidy, T., McDowell, B., Chambers, N., Bethard, S.: An annotation framework for dense event ordering. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics, pp. 501–506 (2014)

  5. Zhou, Y., et al.: Clinical temporal relation extraction with probabilistic soft logic regularization and global inference. In: Proceedings of the AAAI Conference on Artificial Intelligence (2021)

  6. Ning, Q., Wu, H., Roth, D.: A multi-axis annotation scheme for event temporal relations. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, vol. 1: Long Papers, pp. 1318–1328 (2018)

  7. Lan, Z., Chen, M., Goodman, S., Gimpel, K., Sharma, P., Soricut, R.: ALBERT: a lite BERT for self-supervised learning of language representations. arXiv preprint arXiv:1909.11942 (2019)

  8. Wen, H., Ji, H.: Utilizing relative event time to enhance event-event temporal relation extraction. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pp. 10431–10437 (2021)

  9. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)

  10. Han, R., Ning, Q., Peng, N.: Joint event and temporal relation extraction with shared representations and structured prediction. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, pp. 434–444 (2019)

  11. Mathur, P., Jain, R., Dernoncourt, F., Morariu, V., Tran, Q.H., Manocha, D.: TIMERS: document-level temporal relation extraction. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, vol. 2: Short Papers, pp. 524–533 (2021)

  12. Lin, C., Miller, T., Dligach, D., Bethard, S., Savova, G.: A BERT-based universal model for both within- and cross-sentence clinical temporal relation extraction. In: Proceedings of the 2nd Clinical Natural Language Processing Workshop, pp. 65–71 (2019)

  13. Han, R., Hsu, I.H., Yang, M., Galstyan, A., Weischedel, R., Peng, N.: Deep structured neural network for event temporal relation extraction. In: Proceedings of the 23rd Conference on Computational Natural Language Learning, pp. 666–106 (2019)

  14. Ning, Q., Subramanian, S., Roth, D.: An improved neural baseline for temporal relation extraction. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, pp. 6203–6209 (2019)

  15. Wang, H., Chen, M., Zhang, H., Roth, D.: Joint constrained learning for event-event relation extraction. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, pp. 696–706 (2020)

  16. Ma, M.D., et al.: EventPlus: a temporal event understanding pipeline. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Demonstrations, pp. 56–65 (2021)

  17. Tan, X., Pergola, G., He, Y.: Extracting event temporal relations via hyperbolic geometry. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pp. 8065–8077 (2021)


Acknowledgements

This work was supported by the Guangdong Province Science and Technology Project 2021A0505080015.

Author information

Corresponding author

Correspondence to Xinning Zhu.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Xie, P., Zhu, X., Zhang, C., Hu, Z., Yang, G. (2022). Cross-Sentence Temporal Relation Extraction with Relative Sentence Time. In: Memmi, G., Yang, B., Kong, L., Zhang, T., Qiu, M. (eds) Knowledge Science, Engineering and Management. KSEM 2022. Lecture Notes in Computer Science, vol 13368. Springer, Cham. https://doi.org/10.1007/978-3-031-10983-6_27

  • DOI: https://doi.org/10.1007/978-3-031-10983-6_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-10982-9

  • Online ISBN: 978-3-031-10983-6

  • eBook Packages: Computer Science (R0)
