Abstract
This research addresses the challenge of forecasting univariate time series. Existing approaches, from traditional statistical models and domain-specific algorithms to, more recently, deep neural networks, typically rely on raw time series observations; alternative representations such as higher-dimensional embeddings have mainly served auxiliary analyses such as Topological Data Analysis (TDA). In contrast to these conventional methods, this study explores higher-dimensional embedding as the primary data representation. Leveraging this embedding, we introduce a geometrical realization model that captures the crucial data points of the embedded representation. We then propose a deep neural network inspired by N-BEATS, incorporating a TDA model, an attention model, and a convolutional neural network (CNN) model in parallel as sub-modules alongside the geometrical realization model. To assess the efficacy of the proposed model, we evaluate it on time series datasets from diverse domains, including electricity load demand and the M4 competition datasets. Furthermore, we conduct an ablation study to analyze the contribution of each sub-module to the final predictions.
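For context on the higher-dimensional embedding that serves as the primary representation above, the following is a minimal sketch of time-delay embedding in the sense of Takens (1981), which lifts a univariate series into a point cloud in m-dimensional space. The dimension m, the delay tau, and the noisy sine series are illustrative assumptions for this sketch, not the settings used in the paper.

```python
import numpy as np

def delay_embed(x: np.ndarray, m: int = 3, tau: int = 1) -> np.ndarray:
    """Time-delay (Takens) embedding: map a univariate series x to an
    m-dimensional point cloud with delay tau. Row i of the result is
    (x[i], x[i + tau], ..., x[i + (m - 1) * tau])."""
    n = len(x) - (m - 1) * tau
    if n <= 0:
        raise ValueError("series too short for the chosen m and tau")
    return np.stack([x[k : k + n] for k in range(0, m * tau, tau)], axis=1)

# Toy example: a noisy sine wave embeds to a closed loop in 3D; the loop's
# single one-dimensional hole is the kind of structure persistent homology
# summarizes for downstream TDA modules.
t = np.linspace(0, 8 * np.pi, 500)
x = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
cloud = delay_embed(x, m=3, tau=10)
print(cloud.shape)  # (480, 3)
```

Point clouds of this form are the usual input to persistent-homology summaries such as persistence images (Adams et al., 2015) or persistence landscapes (Bubenik, 2012) listed in the references; how the paper's sub-modules consume the embedding is detailed in the full text.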
References
Adams, H., et al.: Persistence images: a stable vector representation of persistent homology (2015). https://arxiv.org/abs/1507.06217
Ba, J.L., Kiros, J.R., Hinton, G.E.: Layer normalization (2016)
Bao, W., Yue, J., Rao, Y.: A deep learning framework for financial time series using stacked autoencoders and long-short term memory. PLoS ONE 12(7), e0180944 (2017). https://doi.org/10.1371/journal.pone.0180944
Bubenik, P.: Statistical topological data analysis using persistence landscapes (2012). https://arxiv.org/abs/1207.6437
Carrière, M., Chazal, F., Ike, Y., Lacombe, T., Royer, M., Umeda, Y.: PersLay: a neural network layer for persistence diagrams and new graph topological signatures (2019). https://arxiv.org/abs/1904.09378
Czarnowski, J., Laidlow, T., Clark, R., Davison, A.J.: DeepFactors: real-time probabilistic dense monocular SLAM. IEEE Robot. Autom. Lett. 5(2), 721–728 (2020). https://doi.org/10.1109/lra.2020.2965415
Edelsbrunner, H., Letscher, D., Zomorodian, A.: Topological persistence and simplification. Discret. Comput. Geom. 28(4), 511–533 (2002). https://doi.org/10.1007/s00454-002-2885-2
Gensler, A., Henze, J., Sick, B., Raabe, N.: Deep learning for solar power forecasting - an approach using autoencoder and LSTM neural networks. In: 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE (2016). https://doi.org/10.1109/smc.2016.7844673
Gers, F., Schmidhuber, J., Cummins, F.: Learning to forget: continual prediction with LSTM. In: 1999 Ninth International Conference on Artificial Neural Networks ICANN 99 (Conference Publication No. 470), vol. 2, pp. 850–855 (1999). https://doi.org/10.1049/cp:19991218
Gidea, M., Katz, Y.: Topological data analysis of financial time series: landscapes of crashes (2017)
Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997). https://doi.org/10.1162/neco.1997.9.8.1735
Hofer, C., Kwitt, R., Dixit, M., Niethammer, M.: Connectivity-optimized representation learning via persistent homology (2019). https://arxiv.org/abs/1906.09003
Jin, X., Yu, X., Wang, X., Bai, Y., Su, T., Kong, J.: Prediction for time series with CNN and LSTM. In: Wang, R., Chen, Z., Zhang, W., Zhu, Q. (eds.) Proceedings of the 11th International Conference on Modelling, Identification and Control (ICMIC2019). LNEE, vol. 582, pp. 631–641. Springer, Singapore (2020). https://doi.org/10.1007/978-981-15-0474-7_59
Koprinska, I., Wu, D., Wang, Z.: Convolutional neural networks for energy time series forecasting. In: 2018 International Joint Conference on Neural Networks (IJCNN). IEEE (2018)
Kusano, G., Fukumizu, K., Hiraoka, Y.: Persistence weighted Gaussian kernel for topological data analysis (2016). https://arxiv.org/abs/1601.01741
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization (2017). https://arxiv.org/abs/1711.05101
Makridakis, S., Spiliotis, E., Assimakopoulos, V.: The M4 competition: 100,000 time series and 61 forecasting methods. Int. J. Forecast. 36(1), 54–74 (2020). https://doi.org/10.1016/j.ijforecast.2019.04.014
Maletic, S., Zhao, Y., Rajkovic, M.: Persistent topological features of dynamical systems (2015). https://arxiv.org/abs/1510.06933
Oreshkin, B.N., Carpov, D., Chapados, N., Bengio, Y.: N-BEATS: neural basis expansion analysis for interpretable time series forecasting (2019)
Perea, J.A.: Topological time series analysis (2018)
Rangapuram, S.S., Seeger, M.W., Gasthaus, J., Stella, L., Wang, Y., Januschowski, T.: Deep state space models for time series forecasting. In: Advances in Neural Information Processing Systems, vol. 31. Curran Associates, Inc. (2018). https://proceedings.neurips.cc/paper_files/paper/2018/file/5cf68969fb67aa6082363a6d4e6468e2-Paper.pdf
Reininghaus, J., Huber, S., Bauer, U., Kwitt, R.: A stable multi-scale kernel for topological machine learning (2014). https://arxiv.org/abs/1412.6821
Romeu, P., Zamora-Martínez, F., Botella-Rocamora, P., Pardo, J.: Stacked denoising auto-encoders for short-term time series forecasting. In: Koprinkova-Hristova, P., Mladenov, V., Kasabov, N.K. (eds.) Artificial Neural Networks. SSB, vol. 4, pp. 463–486. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-09903-3_23
Rotman, J.J.: An Introduction to Homological Algebra. Springer, New York (2009). https://doi.org/10.1007/b98977
Sagheer, A., Kotb, M.: Time series forecasting of petroleum production using deep LSTM recurrent networks. Neurocomputing 323, 203–213 (2019). https://doi.org/10.1016/j.neucom.2018.09.082
Salinas, D., Flunkert, V., Gasthaus, J.: DeepAR: probabilistic forecasting with autoregressive recurrent networks (2017)
Sauer, T.: Attractor reconstruction. Scholarpedia 1(10), 1727 (2006). https://doi.org/10.4249/scholarpedia.1727
Smith, L.N., Topin, N.: Super-convergence: very fast training of neural networks using large learning rates (2017). https://arxiv.org/abs/1708.07120
Sutskever, I., Vinyals, O., Le, Q.V.: Sequence to sequence learning with neural networks (2014). https://arxiv.org/abs/1409.3215
Takens, F.: Detecting strange attractors in turbulence. In: Rand, D., Young, L.S. (eds.) Dynamical Systems and Turbulence, Warwick 1980. LNM, vol. 898, pp. 366–381. Springer, Heidelberg (1981). https://doi.org/10.1007/bfb0091924
Vaswani, A., et al.: Attention is all you need (2017)
Wen, Q., et al.: Transformers in time series: a survey (2022)
Whitney, H.: Geometric Integration Theory. Princeton University Press, Princeton (1957). https://doi.org/10.1515/9781400877577
Wibawa, A.P., Utama, A.B.P., Elmunsyah, H., Pujianto, U., Dwiyanto, F.A., Hernandez, L.: Time-series analysis with smoothed convolutional neural network. J. Big Data 9(1), 44 (2022)
Williams, G.: Chaos theory tamed (1997). https://doi.org/10.1201/9781482295412
Xue, W., Zhou, T., Wen, Q., Gao, J., Ding, B., Jin, R.: Make transformer great again for time series forecasting: channel aligned robust dual transformer (2023)
Yu, H.F., Rao, N., Dhillon, I.S.: Temporal regularized matrix factorization for high-dimensional time series prediction. In: Lee, D., Sugiyama, M., Luxburg, U., Guyon, I., Garnett, R. (eds.) Advances in Neural Information Processing Systems, vol. 29. Curran Associates, Inc. (2016). https://proceedings.neurips.cc/paper_files/paper/2016/file/85422afb467e9456013a2a51d4dff702-Paper.pdf
Zeng, A., Chen, M., Zhang, L., Xu, Q.: Are transformers effective for time series forecasting? (2022)
Zeng, S., Graf, F., Hofer, C., Kwitt, R.: Topological attention for time series forecasting. In: Ranzato, M., Beygelzimer, A., Dauphin, Y., Liang, P., Vaughan, J.W. (eds.) Advances in Neural Information Processing Systems, vol. 34, pp. 24871–24882. Curran Associates, Inc. (2021). https://proceedings.neurips.cc/paper_files/paper/2021/file/d062f3e278a1fbba2303ff5a22e8c75e-Paper.pdf
Zeng, Z., Kaur, R., Siddagangappa, S., Rahimi, S., Balch, T., Veloso, M.: Financial time series forecasting using CNN and transformer (2023)
Zhang, L., et al.: A review of machine learning in building load prediction. Appl. Energy 285, 116452 (2021). https://doi.org/10.1016/j.apenergy.2021.116452
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Bayeh, A., Mouhoub, M., Sadaoui, S. (2024). Geometrical Realization for Time Series Forecasting. In: Fred, A., Hadjali, A., Gusikhin, O., Sansone, C. (eds.) Deep Learning Theory and Applications. DeLTA 2024. Communications in Computer and Information Science, vol. 2172. Springer, Cham. https://doi.org/10.1007/978-3-031-66705-3_1
DOI: https://doi.org/10.1007/978-3-031-66705-3_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-66704-6
Online ISBN: 978-3-031-66705-3