Abstract
Variational Bayesian learning has been proposed as an approximation method for Bayesian learning. Despite its efficiency and good experimental performance, its mathematical properties have not yet been clarified. In this paper we analyze the variational Bayesian stochastic context-free grammar (SCFG) in the case where the model includes the true distribution, so that the model is non-identifiable. We derive the asymptotic form of the variational free energy and show that, under certain conditions on the prior, the free energy is much smaller than that of identifiable models, which corresponds to eliminating redundant non-terminals.
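As a hedged sketch of the kind of result the paper derives (the exact statement and the value of the coefficient are not given in this abstract), in Bayesian learning theory the free energy of n training samples X^n = (X_1, ..., X_n) typically admits an asymptotic expansion of the form

F(X^n) = n·S_n + λ·log n + o(log n),

where S_n is the empirical entropy of the true distribution and λ is a model-dependent coefficient. For a regular (identifiable) model, λ = d/2 with d the number of parameters, the familiar BIC penalty. The abstract's claim is that, for the variational Bayesian SCFG containing the true distribution, certain priors make λ much smaller than d/2, and this smaller coefficient corresponds to the elimination of redundant non-terminals.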
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Hosino, T., Watanabe, K., Watanabe, S. (2006). Free Energy of Stochastic Context Free Grammar on Variational Bayes. In: King, I., Wang, J., Chan, LW., Wang, D. (eds) Neural Information Processing. ICONIP 2006. Lecture Notes in Computer Science, vol 4232. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893028_46
DOI: https://doi.org/10.1007/11893028_46
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-46479-2
Online ISBN: 978-3-540-46480-8
eBook Packages: Computer Science (R0)