
Data Assimilation in the Latent Space of a Convolutional Autoencoder

  • Conference paper
  • First Online:
Computational Science – ICCS 2021 (ICCS 2021)

Abstract

Data Assimilation (DA) is a Bayesian inference technique that combines the state of a dynamical system with real data collected by instruments at a given time. The goal of DA is to improve the accuracy of the dynamical system, making its results as close to reality as possible. One of the most popular techniques for DA is the Kalman Filter (KF). When the dynamical system refers to a real-world application, the representation of the state of the physical system usually leads to a big data problem. For such problems, the KF becomes computationally too expensive and mandates the use of reduced-order modelling techniques. In this paper we propose a new methodology called Latent Assimilation (LA). It consists of performing the KF in the latent space obtained by an Autoencoder with non-linear encoder and decoder functions. In the latent space, the dynamical system is represented by a surrogate model built with a Recurrent Neural Network. In particular, a Long Short-Term Memory (LSTM) network is used to train a function which emulates the dynamical system in the latent space. The data from the dynamic model and the real data coming from the instruments are both processed through the Autoencoder. We apply the methodology to a real test case and show that LA performs well in terms of both accuracy and efficiency.
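The abstract describes a three-part pipeline: a convolutional autoencoder maps physical states and observations into a low-dimensional latent space, an LSTM surrogate propagates the latent state forward in time, and the Kalman filter analysis step is performed directly on latent vectors before decoding back to physical space. The sketch below illustrates this data flow with TensorFlow/Keras and NumPy; all layer sizes, variable names and the toy data are illustrative assumptions, not the configuration used by the authors.

# A minimal sketch of the Latent Assimilation (LA) data flow, under assumed shapes and sizes.
import numpy as np
from tensorflow.keras import layers, models

LATENT_DIM = 8            # assumed size of the latent space
GRID = (32, 32, 1)        # assumed shape of one state snapshot

# 1) Convolutional autoencoder: non-linear encoder/decoder to and from the latent space.
encoder = models.Sequential([
    layers.Input(shape=GRID),
    layers.Conv2D(16, 3, strides=2, padding="same", activation="relu"),
    layers.Conv2D(32, 3, strides=2, padding="same", activation="relu"),
    layers.Flatten(),
    layers.Dense(LATENT_DIM),
])
decoder = models.Sequential([
    layers.Input(shape=(LATENT_DIM,)),
    layers.Dense(8 * 8 * 32, activation="relu"),
    layers.Reshape((8, 8, 32)),
    layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(1, 3, strides=2, padding="same"),
])

# 2) LSTM surrogate: predicts the next latent state from a short history of latent states.
surrogate = models.Sequential([
    layers.Input(shape=(None, LATENT_DIM)),
    layers.LSTM(32),
    layers.Dense(LATENT_DIM),
])

# 3) Standard Kalman filter analysis step, applied directly to latent vectors.
def kalman_update(h_forecast, P, z_obs, H, R):
    """h_forecast: latent forecast; P: forecast error covariance;
    z_obs: encoded observation; H: observation operator; R: observation error covariance."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                    # Kalman gain
    h_analysis = h_forecast + K @ (z_obs - H @ h_forecast)
    P_analysis = (np.eye(len(h_forecast)) - K @ H) @ P
    return h_analysis, P_analysis

# Toy usage with random data and untrained networks, just to show the data flow.
history = np.random.rand(1, 5, LATENT_DIM).astype("float32")   # past latent states
obs_field = np.random.rand(1, *GRID).astype("float32")         # "instrument" data

h_f = surrogate(history).numpy()[0]     # latent forecast from the LSTM surrogate
z = encoder(obs_field).numpy()[0]       # observation mapped into the latent space
H = np.eye(LATENT_DIM)                  # assume the full latent state is observed
P = 0.1 * np.eye(LATENT_DIM)            # assumed forecast error covariance
R = 0.01 * np.eye(LATENT_DIM)           # assumed observation error covariance

h_a, _ = kalman_update(h_f, P, z, H, R)
x_a = decoder(h_a[np.newaxis, :].astype("float32"))   # assimilated state, back in physical space
print(x_a.shape)                                       # (1, 32, 32, 1)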




Acknowledgements

This work is supported by the EPSRC Grand Challenge grant Managing Air for Green Inner Cities (MAGIC) EP/N010221/1, the EP/T003189/1 Health assessment across biological length scales for personal pollution exposure and its mitigation (INHALE), the EP/T000414/1 PREdictive Modelling with QuantIfication of UncERtainty for MultiphasE Systems (PREMIERE) and the Leonardo Centre for Sustainable Business at Imperial College London.

Author information

Corresponding author

Correspondence to Rossella Arcucci.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Amendola, M. et al. (2021). Data Assimilation in the Latent Space of a Convolutional Autoencoder. In: Paszynski, M., Kranzlmüller, D., Krzhizhanovskaya, V.V., Dongarra, J.J., Sloot, P.M. (eds) Computational Science – ICCS 2021. ICCS 2021. Lecture Notes in Computer Science, vol 12746. Springer, Cham. https://doi.org/10.1007/978-3-030-77977-1_30


  • DOI: https://doi.org/10.1007/978-3-030-77977-1_30

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-77976-4

  • Online ISBN: 978-3-030-77977-1

  • eBook Packages: Computer Science, Computer Science (R0)
