
Geometrical Realization for Time Series Forecasting

  • Conference paper
  • Published in: Deep Learning Theory and Applications (DeLTA 2024)

Abstract

This research focuses on forecasting univariate time series. Existing approaches, ranging from traditional statistical models and domain-specific algorithms to, more recently, deep neural networks, typically train on raw time series observations; alternative representations such as higher-dimensional embeddings are used mainly for auxiliary analyses such as Topological Data Analysis (TDA). In contrast to conventional time series analysis methods, this study explores higher-dimensional embedding as the primary data representation. Leveraging this embedding, we introduce a geometrical realization model that captures the crucial data points of the embedded representation. We then propose a deep neural network inspired by N-BEATS that incorporates a TDA model, an attention model, and a convolutional neural network (CNN) model in parallel as sub-modules alongside the geometrical realization model. To assess the efficacy of the proposed model, we conduct evaluations on time series datasets from diverse domains, including electricity load demands and the M4 competition datasets. We also conduct an ablation study to analyze the contribution of each sub-module to the final predictions.
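As background for the embedding step the abstract describes, the following is a minimal sketch of a standard time-delay (Takens-style) embedding, one common way to lift a univariate series into a higher-dimensional point cloud. The function name and the parameters `dim` and `tau` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def delay_embedding(x, dim=3, tau=1):
    # Time-delay embedding: each point is (x[t], x[t+tau], ..., x[t+(dim-1)*tau]).
    # This is the standard construction behind higher-dimensional
    # representations of a univariate series; parameter values are illustrative.
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i : i + n] for i in range(0, dim * tau, tau)], axis=1)

# Example: embed a noisy sine wave as a cloud of 3-D points.
x = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.05 * np.random.randn(400)
points = delay_embedding(x, dim=3, tau=5)
print(points.shape)  # (390, 3)
```

Downstream analyses such as persistent homology, or the geometrical realization model proposed here, would operate on point clouds of this kind rather than on the raw observations.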
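To make the parallel sub-module design concrete, here is a schematic sketch under loudly stated assumptions: the branch internals are generic placeholders standing in for the geometrical realization, TDA, attention, and CNN sub-modules, and the averaging combination rule is an assumption; the paper's actual N-BEATS-style blocks are not reproduced.

```python
import torch
import torch.nn as nn

class ParallelForecaster(nn.Module):
    # Schematic only: four independent branches map the lookback window
    # to a forecast, and their outputs are combined (here, by averaging).
    # Each branch is a hypothetical stand-in, not the paper's definition.
    def __init__(self, lookback, horizon, hidden=64):
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Linear(lookback, hidden), nn.ReLU(), nn.Linear(hidden, horizon)
            )
        self.branches = nn.ModuleList(branch() for _ in range(4))

    def forward(self, x):  # x: (batch, lookback)
        return torch.stack([b(x) for b in self.branches]).mean(dim=0)

model = ParallelForecaster(lookback=96, horizon=24)
y_hat = model(torch.randn(8, 96))
print(y_hat.shape)  # torch.Size([8, 24])
```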


Notes

  1. https://archive.ics.uci.edu/ml/datasets/ElectricityLoadDiagrams20112014
  2. https://mofc.unic.ac.cy/the-dataset/


Author information


Corresponding author

Correspondence to Malek Mouhoub.



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Bayeh, A., Mouhoub, M., Sadaoui, S. (2024). Geometrical Realization for Time Series Forecasting. In: Fred, A., Hadjali, A., Gusikhin, O., Sansone, C. (eds) Deep Learning Theory and Applications. DeLTA 2024. Communications in Computer and Information Science, vol 2172. Springer, Cham. https://doi.org/10.1007/978-3-031-66705-3_1


  • DOI: https://doi.org/10.1007/978-3-031-66705-3_1


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-66704-6

  • Online ISBN: 978-3-031-66705-3

  • eBook Packages: Computer Science, Computer Science (R0)
