
Efficient Spatio-Temporal Graph Neural Networks for Traffic Forecasting

  • Conference paper
  • First Online:
Artificial Intelligence Applications and Innovations (AIAI 2023)

Abstract

Urban traffic forecasting has recently seen a lot of research activity, as it entails a compelling combination of multivariate temporal data with geo-spatial dependencies between multiple data-collection sensors. Current top approaches to this task tend to use costly spatio-temporal pipelines whose complexity typically grows linearly with the time-series length and quadratically with the number of nodes. In this paper, we propose a number of steps that dramatically improve the runtime efficiency of traffic forecasting solutions. First, we use a temporal pooling stack prior to spatial processing to effectively eliminate the time dimension before the spatial components are applied, removing the model's linear dependency on the length of the time series. Second, we construct learnable graph pooling blocks inside the spatial stack that progressively reduce the size of the graph and facilitate better data flow between far-away nodes. Experimental results on the standard METR-LA and PEMS-BAY benchmarks show that the proposed approach yields significant inference and training speedups of up to 5x in the 1-hour prediction task and 27x in the 24-hour prediction task, while matching or surpassing state-of-the-art results. Our findings call into question the need for the time-consuming spatio-temporal processing blocks used in many of the latest solutions for the traffic forecasting task.
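The abstract describes two architectural ingredients: a temporal pooling stack that collapses the time axis before any spatial processing, and learnable graph pooling blocks that progressively coarsen the sensor graph. The PyTorch sketch below only illustrates these two ideas; the module names, layer sizes, and the top-k scoring scheme are assumptions for illustration, not the paper's actual implementation.

    import torch
    import torch.nn as nn


    class TemporalPoolingStack(nn.Module):
        # Strided 1-D convolutions shrink the time axis, then a global average
        # over the remaining steps leaves a single feature vector per sensor,
        # so downstream spatial blocks never see the time dimension.
        # (Hypothetical layer sizes, not the paper's configuration.)
        def __init__(self, in_channels, hidden, n_layers=3):
            super().__init__()
            layers, c = [], in_channels
            for _ in range(n_layers):
                layers += [nn.Conv1d(c, hidden, kernel_size=3, stride=2, padding=1),
                           nn.ReLU()]
                c = hidden
            self.net = nn.Sequential(*layers)

        def forward(self, x):                  # x: (nodes, channels, time)
            return self.net(x).mean(dim=-1)    # (nodes, hidden), time removed


    class GraphPoolingBlock(nn.Module):
        # Learnable top-k pooling in the spirit of Graph U-Nets: a scoring layer
        # keeps the highest-scoring nodes and the induced subgraph, so messages
        # travel between distant sensors in fewer hops on the coarser graph.
        def __init__(self, dim, ratio=0.5):
            super().__init__()
            self.score = nn.Linear(dim, 1)
            self.ratio = ratio

        def forward(self, h, adj):             # h: (nodes, dim), adj: (nodes, nodes)
            s = torch.sigmoid(self.score(h)).squeeze(-1)
            k = max(1, int(self.ratio * h.size(0)))
            idx = torch.topk(s, k).indices
            h_pool = h[idx] * s[idx].unsqueeze(-1)   # gate the kept node features
            adj_pool = adj[idx][:, idx]              # adjacency of the coarse graph
            return h_pool, adj_pool, idx


    # Toy usage: 207 sensors (as in METR-LA), 12 past steps, 2 input channels.
    x = torch.randn(207, 2, 12)
    node_feats = TemporalPoolingStack(2, 64)(x)          # (207, 64)
    adj = torch.rand(207, 207)
    h, a, kept = GraphPoolingBlock(64)(node_feats, adj)  # ~103-node coarse graph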

Y. Lubarsky and A. Gaissinski—Equal contribution.



Author information


Corresponding author

Correspondence to Yackov Lubarsky.



Copyright information

© 2023 IFIP International Federation for Information Processing

About this paper


Cite this paper

Lubarsky, Y., Gaissinski, A., Kisilev, P. (2023). Efficient Spatio-Temporal Graph Neural Networks for Traffic Forecasting. In: Maglogiannis, I., Iliadis, L., MacIntyre, J., Dominguez, M. (eds) Artificial Intelligence Applications and Innovations. AIAI 2023. IFIP Advances in Information and Communication Technology, vol 676. Springer, Cham. https://doi.org/10.1007/978-3-031-34107-6_9


  • DOI: https://doi.org/10.1007/978-3-031-34107-6_9

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-34106-9

  • Online ISBN: 978-3-031-34107-6

  • eBook Packages: Computer Science, Computer Science (R0)
