Abstract
Several researchers have recently proposed alternative estimation methods for Boltzmann machines (BMs) beyond the standard maximum likelihood framework. Examples include contrastive divergence, ratio matching, and the rather classic pseudolikelihood method. At some cost in statistical efficiency, such alternatives can often speed up computation and/or simplify implementation. In this article, taking this direction to an extreme, we show that parameter estimation of BMs can even be done with a closed-form estimator, by recasting the problem as linear regression. A simple simulation experiment confirms that our estimator approaches the true parameters as the sample size increases, although convergence can be slow.
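The abstract's core idea (recovering BM parameters through a linear-regression problem that admits a closed-form least-squares solution) can be illustrated with a sketch. This is a hypothetical reconstruction, not the authors' exact estimator: it relies on the standard fact that, for a fully visible BM over ±1 units, the conditional log-odds of each unit is linear in the parameters, so matching empirical log-odds by ordinary least squares yields a closed-form estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth fully visible BM over x in {-1,+1}^d:
#   p(x) ∝ exp(0.5 * x^T W x + b^T x),  with W symmetric, zero diagonal.
d = 3
W = np.array([[0.0, 0.6, -0.4],
              [0.6, 0.0, 0.3],
              [-0.4, 0.3, 0.0]])
b = np.array([0.2, -0.1, 0.0])

def gibbs_sample(n, burn=200, thin=2):
    """Draw n samples via Gibbs sweeps: p(x_i=+1 | x_-i) = sigmoid(2*(b_i + W_i . x))."""
    x = rng.choice([-1.0, 1.0], size=d)
    out = []
    for t in range(burn + n * thin):
        for i in range(d):
            field = b[i] + W[i] @ x  # diagonal of W is zero, so no self-term
            p = 1.0 / (1.0 + np.exp(-2.0 * field))
            x[i] = 1.0 if rng.random() < p else -1.0
        if t >= burn and (t - burn) % thin == 0:
            out.append(x.copy())
    return np.array(out)

X = gibbs_sample(20000)

# Closed-form estimation: for each unit i, the conditional log-odds satisfies
#   0.5 * log[ p(x_i=+1|x_-i) / p(x_i=-1|x_-i) ] = b_i + sum_j W_ij x_j,
# which is linear in (b_i, W_i). Plugging in empirical log-odds per observed
# configuration of x_-i and solving by least squares gives the estimate.
W_hat = np.zeros((d, d))
b_hat = np.zeros(d)
for i in range(d):
    rest = [j for j in range(d) if j != i]
    rows, targets = [], []
    for c in np.unique(X[:, rest], axis=0):
        mask = np.all(X[:, rest] == c, axis=1)
        n_pos = int(np.sum(X[mask, i] == 1.0))
        n_neg = int(np.sum(mask)) - n_pos
        if n_pos == 0 or n_neg == 0:
            continue  # empirical log-odds undefined for this configuration
        rows.append(np.concatenate(([1.0], c)))       # intercept + x_-i
        targets.append(0.5 * np.log(n_pos / n_neg))   # = b_i + W_i,-i . x_-i
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    b_hat[i] = theta[0]
    W_hat[i, rest] = theta[1:]

W_hat = 0.5 * (W_hat + W_hat.T)  # enforce symmetry
print("max |W_hat - W| =", np.max(np.abs(W_hat - W)))
```

Consistent with the abstract, the estimate approaches the true parameters as the sample size grows, but the empirical-log-odds targets are noisy, so convergence can be slow compared with maximum likelihood.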
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Hirayama, Ji., Ishii, S. (2009). A Closed-Form Estimator of Fully Visible Boltzmann Machines. In: Köppen, M., Kasabov, N., Coghill, G. (eds) Advances in Neuro-Information Processing. ICONIP 2008. Lecture Notes in Computer Science, vol 5507. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03040-6_116
Print ISBN: 978-3-642-03039-0
Online ISBN: 978-3-642-03040-6