
TGPPN: A Transformer and Graph Neural Network Based Point Process Network Model

Conference paper · Computer Supported Cooperative Work and Social Computing (ChineseCSCW 2023)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 2012)


Abstract

The Temporal Point Process (TPP) is applicable in various fields, including healthcare, device failure prediction, and social media: it precisely models event occurrences together with their types and timestamps. Although recent studies have integrated deep learning and reinforcement learning techniques into TPPs, most focus only on the event sequence and ignore other fundamental signals, such as accompanying time series. Moreover, most of these studies predict only the next event, which is insufficient for practical applications that need multi-step event prediction. We therefore propose the TGPPN model, which employs a Transformer structure to address the multi-step forecasting task and a graph neural network to handle multivariate time series in conjunction with event sequences. Our experiments on real-world datasets demonstrate the effectiveness of the proposed model.
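As background for readers unfamiliar with temporal point processes: a TPP is characterized by a conditional intensity function giving the instantaneous event rate given the history. The sketch below is not the paper's model; it illustrates the classic self-exciting (Hawkes) intensity, lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i)), with illustrative parameter values.

```python
import math

def hawkes_intensity(t, history, mu=0.2, alpha=0.8, beta=1.0):
    """Conditional intensity of a univariate Hawkes process.

    mu    -- baseline event rate
    alpha -- jump in intensity contributed by each past event
    beta  -- exponential decay rate of that contribution

    Each past event temporarily raises the probability of the next one,
    which is what makes the process "self-exciting".
    """
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in history if ti < t)

events = [1.0, 1.5, 2.0]
print(hawkes_intensity(2.1, events))   # elevated: three recent events still excite
print(hawkes_intensity(10.0, events))  # long after the burst, decays toward mu
```

Neural TPP models such as the one proposed here replace this fixed parametric form with a learned function of the encoded history, which is what allows event types, timestamps, and (in TGPPN) auxiliary time series to shape the predicted dynamics.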


Notes

  1. https://www.kaggle.com/datasets/mchirico/montcoalert.

  2. https://www.kaggle.com/datasets/szrlee/stock-time-series-20050101-to-20171231.

  3. https://www.kaggle.com/datasets/secareanualin/football-events.

  4. https://www.kaggle.com/datasets/mkechinov/ecommerce-events-history-in-cosmetics-shop.


Acknowledgement

This work is partially supported by the Program of Technology Innovation of the Science and Technology Commission of Shanghai Municipality (Grant Nos. 21511104700 and 21DZ1205000), and by the China National Science Foundation (Grant No. 62072301).

Author information

Correspondence to Jian Cao.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Qi, Q., Shen, S., Cao, J., Guan, W., Gan, S. (2024). TGPPN: A Transformer and Graph Neural Network Based Point Process Network Model. In: Sun, Y., Lu, T., Wang, T., Fan, H., Liu, D., Du, B. (eds) Computer Supported Cooperative Work and Social Computing. ChineseCSCW 2023. Communications in Computer and Information Science, vol 2012. Springer, Singapore. https://doi.org/10.1007/978-981-99-9637-7_17

  • DOI: https://doi.org/10.1007/978-981-99-9637-7_17

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-9636-0

  • Online ISBN: 978-981-99-9637-7
