
Lack of Consistency of Mean Field and Variational Bayes Approximations for State Space Models

Abstract

We investigate the consistency of both mean field and variational Bayes estimators in the context of linear state space models. We prove that the mean field approximation is asymptotically consistent when the variances of the noise variables in the system are sufficiently small, but that neither the mean field estimator nor the variational Bayes estimator is in general asymptotically consistent as the ‘sample size’ becomes large. We also give a rough estimate of the ‘gap’ between the estimators and the true values.
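
As a minimal sketch of the setting, under assumed notation that is illustrative rather than taken from the paper (the symbols a, b, sigma^2, tau^2, T and the factors q below are placeholders), the linear state space model and the factorized posterior used by the mean field and variational Bayes approximations can be written as follows.

% A scalar linear Gaussian state space model observed for t = 1, ..., T
% (assumed notation, not necessarily the paper's own).
\begin{align*}
  x_t &= a\,x_{t-1} + \varepsilon_t, & \varepsilon_t &\sim \mathcal{N}(0,\sigma^2),\\
  y_t &= b\,x_t + \eta_t,            & \eta_t        &\sim \mathcal{N}(0,\tau^2).
\end{align*}
% Both approximations replace the exact posterior over the hidden states x_{1:T}
% (and, in the variational Bayes case, the parameters \theta = (a, b, \sigma^2, \tau^2))
% by a fully factorized distribution
\[
  p(\theta, x_{1:T} \mid y_{1:T}) \;\approx\; q(\theta)\prod_{t=1}^{T} q_t(x_t),
\]
% chosen to minimize the Kullback--Leibler divergence from the exact posterior; in the
% mean field variant \theta is treated as a point estimate rather than given a
% distribution. The question studied in the paper is whether the resulting estimate
% of \theta converges to the true parameter values as T grows.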

Cite this article

Wang, B., Titterington, D.M. Lack of Consistency of Mean Field and Variational Bayes Approximations for State Space Models. Neural Processing Letters 20, 151–170 (2004). https://doi.org/10.1007/s11063-004-2024-6
