Transformer Models for Bitcoin Price Prediction

DOI: 10.1145/3653644.3680499

Published: 20 September 2024

Abstract

Time series forecasting has been approached with a wide range of techniques, including deep learning methods of varying sophistication that have delivered notable advances in accuracy over the past few years. More recently, sustained interest has turned to Transformers, a class of models known for their capacity to capture intricate long-range dependencies and interactions, an ability that is particularly relevant to time series modeling and to understanding complex temporal patterns. However, how to exploit this capacity in practical forecasting methods is not yet well understood, and doing so still requires significant experimentation and engineering. In this paper, we therefore compare several variants of the Transformer model (the standard Transformer, Autoformer, and Informer), coupled with different combinations of input embeddings. Because our emphasis is on forecasting, we investigate in particular the relationship between the Transformers' input segment length and prediction performance in a multi-step forecasting setting. Our results suggest that Autoformer outperforms both the standard Transformer and Informer across prediction horizons. We also observe that shorter input lengths and shorter prediction lengths generally yield better model performance.
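
To make the experimental setup concrete, the sketch below (a minimal illustration, not the authors' code) shows how a univariate price series is cut into encoder inputs of length input_len and forecast targets of length pred_len, the two quantities whose interaction the paper studies. The function name, data, and window settings are hypothetical stand-ins.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# frame a univariate price series into (input, target) windows for
# multi-step forecasting, varying input and prediction lengths.
import numpy as np

def make_windows(series: np.ndarray, input_len: int, pred_len: int):
    """Slide over `series`, yielding inputs of length `input_len`
    and forecast targets of length `pred_len`."""
    xs, ys = [], []
    for start in range(len(series) - input_len - pred_len + 1):
        xs.append(series[start : start + input_len])
        ys.append(series[start + input_len : start + input_len + pred_len])
    return np.stack(xs), np.stack(ys)

# Toy stand-in for a Bitcoin price series (a random walk, not real data).
rng = np.random.default_rng(0)
prices = 30_000 + np.cumsum(rng.normal(size=1_000))

# Shorter settings first, reflecting the paper's observation that shorter
# input and prediction lengths generally perform better.
for input_len, pred_len in [(24, 6), (48, 12), (96, 24)]:
    X, y = make_windows(prices, input_len, pred_len)
    print(f"input_len={input_len:3d}, pred_len={pred_len:3d} -> "
          f"{len(X)} windows, X shape {X.shape}, y shape {y.shape}")
```

Each (X, y) pair would feed a Transformer variant's encoder and decoder; sweeping input_len and pred_len over a grid like this is one way to set up the comparison the abstract describes.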


Published In

FAIML '24: Proceedings of the 2024 3rd International Conference on Frontiers of Artificial Intelligence and Machine Learning
April 2024, 379 pages
ISBN: 9798400709777
DOI: 10.1145/3653644

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Bitcoin
  2. Deep Learning
  3. Time Series Forecasting
  4. Transformers

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

FAIML 2024
