
Amortized Variational Inference via Nosé-Hoover Thermostat Hamiltonian Monte Carlo

  • Conference paper
Neural Information Processing (ICONIP 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14447)

Abstract

Sampling latents from the posterior distribution efficiently and accurately is a fundamental problem in posterior inference. Markov chain Monte Carlo (MCMC) is a useful tool for this task, but it carries a heavy computational cost, since many transition steps are required to converge to the stationary distribution for each datapoint. Amortized variational inference within the MCMC framework has therefore been proposed, in which the learned model parameters are shared across all observations. The Langevin autoencoder is a recently proposed method that amortizes inference in parameter space. This paper generalizes the Langevin autoencoder by using stochastic gradient Nosé-Hoover thermostat Hamiltonian Monte Carlo to perform amortized updates of the parameters of the inference distribution. The proposed method improves variational inference accuracy for the latents by carefully handling the noise introduced by stochastic gradients, without estimating that noise explicitly. Experiments benchmarking our method against baseline generative methods demonstrate its effectiveness.
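The dynamics underlying the abstract — stochastic gradient Nosé-Hoover thermostat sampling (Ding et al., 2014), in which an adaptive friction variable absorbs unknown gradient noise so that it need not be estimated explicitly — can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the function name `sgnht_sample`, the step size `eps`, and the diffusion constant `A` are assumptions made for the example.

```python
import numpy as np

def sgnht_sample(grad_log_post, theta0, n_steps=1000, eps=1e-2, A=1.0, rng=None):
    """Minimal stochastic gradient Nose-Hoover thermostat sampler.

    grad_log_post: returns a (possibly noisy) gradient of the log-posterior
    at theta. theta0: initial parameter vector.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    d = theta.size
    p = rng.standard_normal(d)   # momentum, resampled once at the start
    xi = A                       # thermostat (adaptive friction) variable
    samples = []
    for _ in range(n_steps):
        # Momentum update: stochastic gradient, friction, injected noise.
        p += (eps * grad_log_post(theta) - eps * xi * p
              + np.sqrt(2.0 * A * eps) * rng.standard_normal(d))
        theta += eps * p
        # The thermostat adjusts friction so the average kinetic energy
        # matches the target temperature, soaking up gradient noise.
        xi += eps * (p @ p / d - 1.0)
        samples.append(theta.copy())
    return np.array(samples)
```

For example, with the exact gradient of a standard normal target, `grad_log_post = lambda t: -t`, the chain's samples should (after burn-in) have mean near 0 and variance near 1.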



Author information

Correspondence to Zhan Yuan.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Yuan, Z., Xu, C., Lin, Z., Zhang, Z. (2024). Amortized Variational Inference via Nosé-Hoover Thermostat Hamiltonian Monte Carlo. In: Luo, B., Cheng, L., Wu, ZG., Li, H., Li, C. (eds) Neural Information Processing. ICONIP 2023. Lecture Notes in Computer Science, vol 14447. Springer, Singapore. https://doi.org/10.1007/978-981-99-8079-6_7


  • DOI: https://doi.org/10.1007/978-981-99-8079-6_7

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8078-9

  • Online ISBN: 978-981-99-8079-6

  • eBook Packages: Computer Science; Computer Science (R0)
