
Bidirectional Spatial-Temporal Adaptive Transformer for Urban Traffic Flow Forecasting


Abstract:

Urban traffic forecasting is a cornerstone of intelligent transportation systems (ITS). Existing methods focus on spatial-temporal dependency modeling, while two intrinsic properties of the traffic forecasting problem are overlooked. First, the complexity of diverse forecasting tasks is nonuniformly distributed across various spaces (e.g., suburb versus downtown) and times (e.g., rush hour versus off-peak). Second, the recollection of past traffic conditions is beneficial to the prediction of future traffic conditions. Based on these properties, we propose a bidirectional spatial-temporal adaptive transformer (Bi-STAT) for accurate traffic forecasting. Bi-STAT adopts an encoder–decoder architecture, where both the encoder and the decoder maintain a spatial-adaptive transformer and a temporal-adaptive transformer structure. Inspired by the first property, each transformer is designed to dynamically process the traffic streams according to their task complexities. Specifically, we realize this by the recurrent mechanism with a novel dynamic halting module (DHM). Each transformer performs iterative computation with shared parameters until DHM emits a stopping signal. Motivated by the second property, Bi-STAT utilizes one decoder to perform the present → past recollection task and the other decoder to perform the present → future prediction task. The recollection task supplies complementary information to assist and regularize the prediction task for better generalization. Through extensive experiments, we show the effectiveness of each module in Bi-STAT and demonstrate the superiority of Bi-STAT over the state-of-the-art baselines on four benchmark datasets. The code is available at https://github.com/chenchl19941118/Bi-STAT.git.
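The abstract describes the DHM only at a high level: a shared-parameter block is applied repeatedly until a halting signal fires. As an illustration only (not the authors' implementation), the control flow resembles adaptive-computation-time-style halting, which can be sketched as follows; `step_fn`, `halt_fn`, and all other names here are hypothetical placeholders:

```python
def adaptive_halting(x, step_fn, halt_fn, threshold=0.99, max_steps=10):
    """ACT-style recurrent loop with dynamic halting (illustrative sketch).

    Repeatedly applies the shared-parameter computation `step_fn`, accumulating
    a halting probability from `halt_fn` each step. Once the cumulative
    probability would cross `threshold` (the stopping signal), the remaining
    probability mass is assigned to the final step and the halting-weighted
    mixture of intermediate states is returned, along with the step count.
    """
    state = list(x)
    cum_p = 0.0                      # accumulated halting probability
    mixture = [0.0] * len(x)         # halting-weighted sum of states
    for step in range(max_steps):
        state = step_fn(state)       # iterative computation, shared parameters
        p = halt_fn(state)           # per-step halting probability in (0, 1)
        if cum_p + p >= threshold or step == max_steps - 1:
            remainder = 1.0 - cum_p  # spend the remaining probability mass
            mixture = [m + remainder * s for m, s in zip(mixture, state)]
            return mixture, step + 1
        cum_p += p
        mixture = [m + p * s for m, s in zip(mixture, state)]
    return mixture, max_steps


# Toy usage: each "step" halves the state; a constant halting probability
# of 0.6 crosses the 0.99 threshold on the second iteration.
out, steps = adaptive_halting(
    [1.0],
    step_fn=lambda s: [v * 0.5 for v in s],
    halt_fn=lambda s: 0.6,
)
```

In the paper's design, easy inputs (e.g., off-peak suburban streams) would drive the halting probability up quickly and exit after few iterations, while complex inputs (e.g., rush-hour downtown streams) would iterate longer.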
Published in: IEEE Transactions on Neural Networks and Learning Systems ( Volume: 34, Issue: 10, October 2023)
Page(s): 6913 - 6925
Date of Publication: 30 June 2022

PubMed ID: 35771780
