PGTNet: A Process Graph Transformer Network for Remaining Time Prediction of Business Process Instances

  • Conference paper
  • In: Advanced Information Systems Engineering (CAiSE 2024)

Abstract

We present PGTNet, an approach that transforms event logs into graph datasets and leverages graph-oriented data for training Process Graph Transformer Networks to predict the remaining time of business process instances. PGTNet consistently outperforms state-of-the-art deep learning approaches across a diverse range of 20 publicly available real-world event logs. Notably, our approach is most promising for highly complex processes, where existing deep learning approaches encounter difficulties stemming from their limited ability to learn control-flow relationships among process activities and capture long-range dependencies. PGTNet addresses these challenges, while also being able to consider multiple process perspectives during the learning process.
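To make the first step of the pipeline concrete, the following is a minimal, hypothetical sketch of how a running process instance (a prefix of events) could be encoded as a weighted directly-follows graph suitable for graph-based learning. The function name, the choice of edge features (transition frequency and mean inter-event time), and the data layout are illustrative assumptions, not the paper's exact encoding.

```python
from collections import defaultdict

def prefix_to_graph(events):
    """Encode one case prefix as a directly-follows graph.

    events: list of (activity, timestamp) tuples, ordered by time.
    Returns (nodes, edges): nodes is a sorted list of activity labels;
    edges maps (source, target) pairs to their transition frequency and
    the mean elapsed time between the two events.
    """
    nodes = sorted({activity for activity, _ in events})
    counts = defaultdict(int)
    durations = defaultdict(float)
    # Each consecutive pair of events contributes one directly-follows edge.
    for (a1, t1), (a2, t2) in zip(events, events[1:]):
        counts[(a1, a2)] += 1
        durations[(a1, a2)] += t2 - t1
    edges = {
        pair: {"freq": counts[pair], "mean_dt": durations[pair] / counts[pair]}
        for pair in counts
    }
    return nodes, edges

# Example: a running case with a repeated "check" activity.
nodes, edges = prefix_to_graph(
    [("register", 0.0), ("check", 2.0), ("check", 5.0), ("decide", 6.0)]
)
```

A graph like this (one per event prefix, labeled with the case's remaining time) is the kind of graph-oriented training instance the abstract refers to; a real implementation would additionally attach node and edge features for the other process perspectives mentioned above.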


Notes

  1. https://github.com/keyvan-amiri/PGTNet.


Author information


Corresponding author

Correspondence to Keyvan Amiri Elyasi.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Amiri Elyasi, K., van der Aa, H., Stuckenschmidt, H. (2024). PGTNet: A Process Graph Transformer Network for Remaining Time Prediction of Business Process Instances. In: Guizzardi, G., Santoro, F., Mouratidis, H., Soffer, P. (eds) Advanced Information Systems Engineering. CAiSE 2024. Lecture Notes in Computer Science, vol 14663. Springer, Cham. https://doi.org/10.1007/978-3-031-61057-8_8

  • DOI: https://doi.org/10.1007/978-3-031-61057-8_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-61056-1

  • Online ISBN: 978-3-031-61057-8

  • eBook Packages: Computer Science (R0)
