
Temporal Attention-Based Graph Convolution Network for Taxi Demand Prediction in Functional Areas

  • Conference paper
Part of the book series: Lecture Notes in Computer Science (LNCS, volume 12937)

Abstract

Shared mobility is increasingly becoming an indispensable mode of urban transportation, and accurately estimating taxi demand across the regions of a city is a challenging problem. In this paper, we divide the city into multiple grids of different sizes and propose a graph convolution network based on a temporal attention mechanism for taxi demand prediction in each functional area of the city. The model combines a graph convolution network (GCN), a temporal convolution network (TCN), and an attention mechanism, which respectively capture the spatial correlation of roads, model the time dependence, and highlight the salient characteristics of the time-series data. Extensive experiments on three datasets validate the effectiveness of the proposed method against several state-of-the-art baselines. Although the three datasets differ considerably in data volume, our model maintains high prediction accuracy. Our model code is available at https://github.com/qdu318/TAGCN.
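The architecture described in the abstract can be illustrated with a toy sketch: a temporal attention step summarizes the demand history of each region, and a single symmetrically normalized graph convolution propagates that summary over the region adjacency graph. This is a minimal NumPy illustration of the general technique only; the function names, the scalar output weight `W`, and the single-layer composition are assumptions for exposition, not the paper's actual TAGCN implementation (which stacks GCN and TCN layers).

```python
import numpy as np

def normalize_adj(A):
    # Standard GCN propagation matrix: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def temporal_attention(X):
    # X: (T, N) demand history; score each past step against the latest one
    q = X[-1]                               # query: most recent time step
    scores = X @ q / np.sqrt(X.shape[1])    # scaled dot-product scores, (T,)
    w = np.exp(scores - scores.max())       # numerically stable softmax
    w /= w.sum()
    return (w[:, None] * X).sum(axis=0)     # attention-weighted summary, (N,)

def predict_demand(A, X, W=1.0):
    # One graph-convolution step over the attention-summarized demand
    ctx = temporal_attention(X)
    return normalize_adj(A) @ ctx * W       # per-region prediction, (N,)
```

The attention weights let recent, similar time steps dominate the summary, while the normalized adjacency mixes each region's signal with that of its neighbors, which is the core spatial-temporal coupling the paper's model exploits.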



Acknowledgments

This research was supported in part by Shandong Province colleges and universities youth innovation technology plan innovation team project under Grant No. 2020KJN011, Shandong Provincial Natural Science Foundation under Grant No. ZR2020MF060, Program for Innovative Postdoctoral Talents in Shandong Province under Grant No. 40618030001, National Natural Science Foundation of China under Grant No. 61802216, and Postdoctoral Science Foundation of China under Grant No.2018M642613.

Author information

Corresponding author

Correspondence to Jianbo Li.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Wang, Y., Li, J., Zhao, A., Lv, Z., Lu, G. (2021). Temporal Attention-Based Graph Convolution Network for Taxi Demand Prediction in Functional Areas. In: Liu, Z., Wu, F., Das, S.K. (eds) Wireless Algorithms, Systems, and Applications. WASA 2021. Lecture Notes in Computer Science(), vol 12937. Springer, Cham. https://doi.org/10.1007/978-3-030-85928-2_16

  • DOI: https://doi.org/10.1007/978-3-030-85928-2_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85927-5

  • Online ISBN: 978-3-030-85928-2

  • eBook Packages: Computer Science (R0)
