DOI: 10.1145/3664647.3681701

Boundary-Aware Periodicity-based Sparsification Strategy for Ultra-Long Time Series Forecasting

Published: 28 October 2024

Abstract

In domains such as transportation, resource management, and weather forecasting, there is a pressing need for methods that can predict over a horizon long enough to cover the period required for decision-making and implementation. Compared with conventional time series forecasting, ultra-long time series forecasting requires strengthening the model's ability to extrapolate over long horizons while keeping inference costs within an acceptable range. To address this challenge, we propose the Boundary-Aware Periodicity-based sparsification strategy for Ultra-Long time series forecasting (BAP-UL). The method captures periodic features in the time series and reorganizes inputs and outputs into shorter sub-sequences to improve prediction accuracy. Experiments on several commonly used benchmark datasets demonstrate that the proposed method yields comparable performance across them.
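The abstract describes the mechanism only at a high level: identify the dominant periodicity in the series, then fold the long input into shorter, period-aligned sub-sequences that are cheaper to predict over. The sketch below illustrates that general idea, not the authors' implementation; the FFT-based period estimate and the `dominant_period`/`to_subsequences` helpers are illustrative assumptions.

```python
import numpy as np

def dominant_period(x: np.ndarray) -> int:
    """Estimate the dominant period of a 1-D series from its FFT amplitude spectrum."""
    amp = np.abs(np.fft.rfft(x - x.mean()))
    amp[0] = 0.0                        # suppress any residual DC component
    k = int(np.argmax(amp))             # index of the strongest frequency bin
    return max(1, len(x) // max(k, 1))  # convert bin index to a period length

def to_subsequences(x: np.ndarray, period: int) -> np.ndarray:
    """Fold a period-aligned prefix of the series into rows of length `period`."""
    n = (len(x) // period) * period     # truncate to a whole number of periods
    return x[:n].reshape(-1, period)    # one sub-sequence per row

# Usage: a long series with a daily (24-step) cycle becomes a stack of short
# rows, each covering one period, for a downstream forecaster to consume.
rng = np.random.default_rng(0)
t = np.arange(9_600)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(len(t))
p = dominant_period(series)             # 24 for this synthetic series
subs = to_subsequences(series, p)
print(p, subs.shape)                    # 24 (400, 24)
```

Reshaping an input of length L into roughly L/p sub-sequences of length p means the forecaster only ever operates on short windows, which is how a strategy of this kind can keep inference cost manageable as the horizon grows.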



    Published In

    MM '24: Proceedings of the 32nd ACM International Conference on Multimedia
    October 2024
    11719 pages
    ISBN:9798400706868
    DOI:10.1145/3664647


Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. boundary
    2. periodic
    3. sparsification strategy
    4. ultra-long time series

    Qualifiers

    • Research-article


    Conference

MM '24: The 32nd ACM International Conference on Multimedia
October 28 - November 1, 2024
Melbourne VIC, Australia

    Acceptance Rates

MM '24 Paper Acceptance Rate: 1,150 of 4,385 submissions, 26%
Overall Acceptance Rate: 2,145 of 8,556 submissions, 25%
