Abstract
In this paper, we propose a framework for Contextual Hierarchical Contrastive Learning for Time Series in the Frequency Domain (CHCL-TSFD). We show that converting data from the real (time) domain to the frequency domain cancels a small amount of resonance and yields the optimal frequency for smoothing the converted time series. We then combine an instance-wise contrastive loss with a temporal contrastive loss to learn contextual semantics. Experimental results show an average improvement of 6.2% over other unsupervised learning methods on 70 UCR datasets, and a reduction in error of at least 10% in MSE and at least 5% in MAE against state-of-the-art methods on three public datasets.
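The two ingredients described above, a frequency-domain view of each series and an instance-wise contrastive objective, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the magnitude-only FFT view, and the NT-Xent-style loss formulation are assumptions made for the sketch.

```python
import numpy as np

def to_frequency_domain(x):
    """Convert real-valued time series (batch, length) to the frequency
    domain via the real FFT, keeping only the magnitude spectrum."""
    return np.abs(np.fft.rfft(x, axis=-1))

def instance_contrastive_loss(z1, z2, temperature=0.5):
    """NT-Xent-style loss over two views z1, z2 of shape (B, D):
    row i of z1 should match row i of z2 and repel the other rows."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature               # (B, B) cosine similarities
    sim = sim - sim.max(axis=1, keepdims=True)  # for numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))          # positives on the diagonal

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 128))        # batch of 4 series, length 128
spec = to_frequency_domain(x)            # shape (4, 65): 128 // 2 + 1 bins
view2 = spec + 0.01 * rng.standard_normal(spec.shape)  # lightly perturbed view
loss = instance_contrastive_loss(spec, view2)
print(spec.shape)                        # → (4, 65)
```

In the paper, a temporal contrastive loss over timestamps within each series would be added alongside this instance-level term; the sketch covers only the instance-level part.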
This work was supported by the Science Foundation of China University of Petroleum, Beijing (No. 2462020YXZZ023).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Wang, Mh., Liu, Jw. (2023). The Context Hierarchical Contrastive Learning for Time Series in Frequency Domain. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Communications in Computer and Information Science, vol 1791. Springer, Singapore. https://doi.org/10.1007/978-981-99-1639-9_4
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-1638-2
Online ISBN: 978-981-99-1639-9