Abstract
Tensor time series (TTS) data, a generalization of one-dimensional time series to a high-dimensional space, are ubiquitous in real-world applications. Compared with modeling univariate or multivariate time series, which has received much attention and achieved tremendous progress in recent years, tensor time series has attracted far less effort. However, properly handling TTS is a much more challenging task, due to its high-dimensional and complex inner structure. In this article, we start by revealing the structure of TTS data from a statistical point of view. Then, in line with this analysis, we perform
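To make the data setting concrete, a tensor time series can be represented as a multi-dimensional array with one temporal mode and several non-temporal modes, and normalization statistics can be pooled along different modes. The sketch below is purely illustrative (the array shape, variable names, and `normalize_along` helper are assumptions, not the paper's actual TTS-Norm method):

```python
import numpy as np

# Toy tensor time series with modes (time, location_rows, location_cols, features).
# The shape is illustrative; real TTS data would come from an application domain.
rng = np.random.default_rng(0)
tts = rng.normal(loc=5.0, scale=2.0, size=(24, 4, 4, 3))

def normalize_along(x, axes):
    """Zero-mean, unit-variance normalization with statistics pooled over `axes`."""
    mean = x.mean(axis=axes, keepdims=True)
    std = x.std(axis=axes, keepdims=True)
    return (x - mean) / (std + 1e-8)

# Temporal normalization: statistics pooled over the time mode.
temporal_norm = normalize_along(tts, axes=(0,))

# Spatial normalization: statistics pooled over the two location modes.
spatial_norm = normalize_along(tts, axes=(1, 2))

print(temporal_norm.shape)  # (24, 4, 4, 3)
```

Normalizing along different modes removes different sources of heterogeneity (e.g., per-location temporal scale vs. per-timestep spatial scale), which is the general idea behind multi-way normalization schemes.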
Index Terms
- TTS-Norm: Forecasting Tensor Time Series via Multi-Way Normalization