
Stochastic Complexity for Mixture of Exponential Families in Variational Bayes

  • Conference paper
Algorithmic Learning Theory (ALT 2005)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3734)


Abstract

Variational Bayesian learning, proposed as an approximation to Bayesian learning, provides computational tractability and good generalization performance in many applications. However, little has been done to investigate its theoretical properties.

In this paper, we discuss Variational Bayesian learning of mixtures of exponential families and derive the asymptotic form of the stochastic complexities. We show that the stochastic complexities become smaller than those of regular statistical models, which implies that the advantage of Bayesian learning is retained in Variational Bayesian learning. The stochastic complexity, also called the marginal likelihood or the free energy, is important not only for the model selection problem but also because it enables us to discuss the accuracy of the Variational Bayesian approach as an approximation of true Bayesian learning.
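As a concrete illustration of the quantity discussed above (not taken from the paper itself): in Variational Bayes, the variational free energy F is an upper bound on the stochastic complexity −log p(xⁿ), and minimizing F over the variational posterior tightens that bound. The sketch below runs VB for a unit-variance Gaussian mixture with a Dirichlet prior on mixing weights and Gaussian priors on component means; the model, priors, and all names (`vb_gmm`, `alpha0`, `beta0`) are illustrative assumptions, not the paper's construction.

```python
import numpy as np
from scipy.special import digamma, gammaln

def vb_gmm(x, K, alpha0=1.0, beta0=1.0, iters=50, seed=0):
    """VB for a K-component, unit-variance Gaussian mixture.
    Returns the trajectory of the variational free energy F = -ELBO,
    an upper bound on the stochastic complexity -log p(x)."""
    rng = np.random.default_rng(seed)
    N = len(x)
    r = rng.dirichlet(np.ones(K), size=N)   # soft responsibilities, shape (N, K)
    F_hist = []
    for _ in range(iters):
        # update q(pi) = Dir(alpha) and q(mu_k) = N(m_k, s2_k)
        Nk = r.sum(axis=0)                  # effective counts per component
        alpha = alpha0 + Nk
        prec = beta0 + Nk                   # posterior precision of mu_k
        m = (r * x[:, None]).sum(axis=0) / prec
        s2 = 1.0 / prec
        # update q(z): responsibilities from expected log-joint
        Elogpi = digamma(alpha) - digamma(alpha.sum())
        Esq = (x[:, None] - m) ** 2 + s2    # E[(x_n - mu_k)^2]
        log_rho = Elogpi - 0.5 * np.log(2 * np.pi) - 0.5 * Esq
        log_r = log_rho - log_rho.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # free energy F = -ELBO (exact for the current factorized posterior)
        Elik = np.sum(r * (-0.5 * np.log(2 * np.pi) - 0.5 * Esq))
        Elogpz = np.sum(r * Elogpi)
        Entz = -np.sum(r * np.log(np.clip(r, 1e-300, None)))
        kl_pi = (gammaln(alpha.sum()) - gammaln(alpha).sum()
                 - gammaln(K * alpha0) + K * gammaln(alpha0)
                 + np.sum((alpha - alpha0)
                          * (digamma(alpha) - digamma(alpha.sum()))))
        kl_mu = np.sum(-0.5 * np.log(beta0 * s2)
                       + 0.5 * beta0 * (s2 + m ** 2) - 0.5)
        F_hist.append(-(Elik + Elogpz + Entz - kl_pi - kl_mu))
    return F_hist
```

Because each update is coordinate ascent on the ELBO, F is non-increasing over iterations, and its limit is the (upper-bounding) approximation to the stochastic complexity whose asymptotic form the paper analyzes.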






Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Watanabe, K., Watanabe, S. (2005). Stochastic Complexity for Mixture of Exponential Families in Variational Bayes. In: Jain, S., Simon, H.U., Tomita, E. (eds) Algorithmic Learning Theory. ALT 2005. Lecture Notes in Computer Science, vol 3734. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11564089_10


  • DOI: https://doi.org/10.1007/11564089_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-29242-5

  • Online ISBN: 978-3-540-31696-1

  • eBook Packages: Computer Science (R0)
