
Functional Latent Dynamics for Irregularly Sampled Time Series Forecasting

  • Conference paper
Machine Learning and Knowledge Discovery in Databases. Research Track (ECML PKDD 2024)

Abstract

Irregularly sampled time series with missing values are often observed in real-world applications such as healthcare, climate and astronomy. They pose a significant challenge to standard deep learning models, which operate only on fully observed and regularly sampled time series. To capture the continuous dynamics of irregular time series, many models rely on solving an Ordinary Differential Equation (ODE) in the hidden state. These ODE-based models tend to be slow and memory-intensive due to their sequential operations and complex ODE solvers. As an alternative, we propose a family of models called Functional Latent Dynamics (FLD). Instead of solving an ODE, we use simple curves, defined at all time points, to specify the continuous latent state of the model. The coefficients of these curves are learned only from the observed values in the time series, ignoring the missing values. Through extensive experiments, we demonstrate that FLD achieves better performance than the best ODE-based model while reducing runtime and memory overhead. Specifically, FLD requires an order of magnitude less time to infer forecasts than the best-performing forecasting model.
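To make the core idea concrete: rather than integrating an ODE, the continuous latent state is given by simple curves whose coefficients are predicted from the observed values alone, so the latent state can be evaluated at any query time in one shot. Below is a minimal, hypothetical PyTorch sketch of that idea. The polynomial curve family, the GRU encoder, and all names and dimensions (FLDForecaster, latent_dim, degree) are illustrative assumptions, not the architecture described in the paper.

    import torch
    import torch.nn as nn

    class FLDForecaster(nn.Module):
        """Hypothetical sketch: latent state as simple curves in time, no ODE solver."""

        def __init__(self, n_channels: int, latent_dim: int = 32, degree: int = 2):
            super().__init__()
            self.latent_dim = latent_dim
            self.degree = degree  # assumed curve family: low-degree polynomials in time
            # Encoder: summarizes observed (value, mask, time) triples into one hidden state.
            self.encoder = nn.GRU(input_size=2 * n_channels + 1,
                                  hidden_size=latent_dim, batch_first=True)
            # Maps the hidden state to one coefficient vector per latent dimension.
            self.to_coeffs = nn.Linear(latent_dim, latent_dim * (degree + 1))
            # Decoder: maps the evaluated latent curve back to the observed channels.
            self.decoder = nn.Linear(latent_dim, n_channels)

        def forward(self, obs_times, obs_values, obs_mask, query_times):
            # obs_values, obs_mask: (B, T_obs, C); obs_times: (B, T_obs); query_times: (B, T_q)
            # Missing entries are zeroed and flagged via the mask, so only observations matter.
            x = torch.cat([obs_values * obs_mask, obs_mask,
                           obs_times.unsqueeze(-1)], dim=-1)
            _, h = self.encoder(x)                            # h: (1, B, latent_dim)
            coeffs = self.to_coeffs(h[-1])                    # (B, latent_dim * (degree + 1))
            coeffs = coeffs.view(-1, self.latent_dim, self.degree + 1)
            # Evaluate the latent curves at the query times: z(t) = sum_k c_k * t**k.
            powers = torch.stack([query_times ** k
                                  for k in range(self.degree + 1)], dim=-1)
            z = torch.einsum('bdk,bqk->bqd', coeffs, powers)  # (B, T_q, latent_dim)
            return self.decoder(z)                            # forecasts: (B, T_q, C)

    # Toy usage with random irregular observations (batch of 4, 3 channels).
    model = FLDForecaster(n_channels=3)
    B, T_obs, T_q, C = 4, 20, 10, 3
    preds = model(torch.rand(B, T_obs), torch.randn(B, T_obs, C),
                  (torch.rand(B, T_obs, C) > 0.5).float(), torch.rand(B, T_q))
    print(preds.shape)  # torch.Size([4, 10, 3])

In this sketch the latent state at any query time is obtained by a single batched polynomial evaluation, which illustrates why no sequential ODE solver (with its per-step compute and memory cost) is needed; the actual curve functions and encoder used by FLD are specified in the paper itself.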


Author information

Corresponding author

Correspondence to Christian Klötergens.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Klötergens, C., Yalavarthi, V.K., Stubbemann, M., Schmidt-Thieme, L. (2024). Functional Latent Dynamics for Irregularly Sampled Time Series Forecasting. In: Bifet, A., Davis, J., Krilavičius, T., Kull, M., Ntoutsi, E., Žliobaitė, I. (eds) Machine Learning and Knowledge Discovery in Databases. Research Track. ECML PKDD 2024. Lecture Notes in Computer Science, vol 14944. Springer, Cham. https://doi.org/10.1007/978-3-031-70359-1_25

  • DOI: https://doi.org/10.1007/978-3-031-70359-1_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-70358-4

  • Online ISBN: 978-3-031-70359-1

  • eBook Packages: Computer Science, Computer Science (R0)
