The Context Hierarchical Contrastive Learning for Time Series in Frequency Domain

  • Conference paper
  • First Online:
Neural Information Processing (ICONIP 2022)

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 1791))


Abstract

In this paper, we propose a framework for Context Hierarchical Contrastive Learning for Time Series in the Frequency Domain (CHCL-TSFD). We show that converting data from the time domain to the frequency domain yields a small amount of resonance cancellation and an optimal frequency for smoothing the converted time series. We then combine an instance-wise contrastive loss with a temporal contrastive loss to learn contextual semantics. Experimental results show an average improvement of 6.2% over other unsupervised learning methods on 70 UCR datasets, and, compared with state-of-the-art methods on three public datasets, an error reduction of at least 10% in MSE and at least 5% in MAE.
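The page gives only the abstract, but the two ingredients it names, a frequency-domain transform of the series and an instance-wise contrastive loss, can be sketched as follows. This is an illustrative NumPy sketch under stated assumptions, not the paper's implementation: the magnitude of the real FFT stands in for the unspecified frequency-domain conversion, and an InfoNCE-style loss stands in for the paper's instance-contrastive loss.

```python
import numpy as np

def to_frequency_domain(x):
    """Map a real-valued time series (batch, length) to its
    frequency-domain magnitude spectrum via the real FFT
    (an assumed stand-in for the paper's transform)."""
    return np.abs(np.fft.rfft(x, axis=-1))

def instance_contrastive_loss(z1, z2, temperature=0.5):
    """InfoNCE-style instance-wise contrastive loss between two
    views z1, z2 of shape (batch, dim): each instance's two views
    form the positive pair, all other instances act as negatives."""
    z = np.concatenate([z1, z2], axis=0)               # (2B, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine sim
    sim = z @ z.T / temperature                        # (2B, 2B)
    np.fill_diagonal(sim, -np.inf)                     # drop self-pairs
    B = z1.shape[0]
    pos = np.concatenate([np.arange(B, 2 * B), np.arange(B)])
    log_prob = sim[np.arange(2 * B), pos] - np.log(np.exp(sim).sum(axis=1))
    return -log_prob.mean()
```

In this sketch the spectrum would be fed to an encoder whose outputs `z1`, `z2` come from two augmented views of the same batch; the temporal contrastive loss mentioned in the abstract would apply the same idea across timestamps rather than instances.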

This work was supported by the Science Foundation of China University of Petroleum, Beijing (No. 2462020YXZZ023).



Author information

Corresponding author

Correspondence to Jian-wei Liu.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Wang, Mh., Liu, Jw. (2023). The Context Hierarchical Contrastive Learning for Time Series in Frequency Domain. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Communications in Computer and Information Science, vol 1791. Springer, Singapore. https://doi.org/10.1007/978-981-99-1639-9_4

  • DOI: https://doi.org/10.1007/978-981-99-1639-9_4

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-1638-2

  • Online ISBN: 978-981-99-1639-9

  • eBook Packages: Computer Science, Computer Science (R0)
