
Design of Exchange Monte Carlo Method for Bayesian Learning in Normal Mixture Models

  • Conference paper
Advances in Neuro-Information Processing (ICONIP 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5506)

Included in the conference series: International Conference on Neural Information Processing (ICONIP)

Abstract

The exchange Monte Carlo (EMC) method was proposed as an improvement over standard Markov chain Monte Carlo methods, and its effectiveness has been demonstrated in spin-glass simulation, Bayesian learning, and many other applications. In this paper, we propose a new EMC algorithm that uses a Gibbs sampler over the hidden variables indicating the component from which each datum is generated, and we demonstrate its effectiveness through simulations of Bayesian learning in normal mixture models.
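The abstract does not spell out the algorithm, but the construction it describes, several Gibbs-sampled replicas at different inverse temperatures that occasionally exchange states, can be sketched for a toy case. The following is a minimal illustration, not the authors' implementation: it assumes a one-dimensional two-component normal mixture with unit variances and fixed equal mixing weights, a Gaussian prior on the component means, a quadratic temperature ladder, and tempering of the complete-data likelihood so that both the hidden labels and the means keep simple conditional distributions. All of these modeling choices are assumptions made for the sketch.

```python
# A minimal sketch of exchange Monte Carlo with a Gibbs sampler over hidden
# component labels. Simplifying assumptions (not taken from the paper):
# 1-D data, two components with unit variance and equal fixed weights,
# N(0, tau^2) prior on the means, quadratic temperature ladder.
import numpy as np

rng = np.random.default_rng(0)

# synthetic data (assumption: true component means -2 and +2)
n = 200
x = np.concatenate([rng.normal(-2.0, 1.0, n // 2),
                    rng.normal(2.0, 1.0, n // 2)])

K = 2                                   # number of mixture components
tau2 = 100.0                            # prior variance of each mean
betas = np.linspace(0.0, 1.0, 8) ** 2   # inverse temperatures; beta = 1 is the posterior
R = len(betas)

# replica state at each temperature: component means and hidden labels
mus = rng.normal(0.0, 1.0, (R, K))
zs = rng.integers(0, K, (R, n))

def energy(mu, z):
    """Negative complete-data log-likelihood, up to state-independent constants."""
    return 0.5 * np.sum((x - mu[z]) ** 2)

def gibbs_sweep(mu, z, beta):
    """One Gibbs sweep targeting prior(mu) * exp(-beta * energy(mu, z))."""
    # sample labels: P(z_i = k) proportional to exp(-beta * (x_i - mu_k)^2 / 2)
    logp = -0.5 * beta * (x[:, None] - mu[None, :]) ** 2
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = (rng.random(n)[:, None] > np.cumsum(p, axis=1)).sum(axis=1)
    # sample means: conjugate Gaussian update with beta-weighted counts
    for k in range(K):
        nk = np.sum(z == k)
        prec = 1.0 / tau2 + beta * nk
        mean = beta * np.sum(x[z == k]) / prec
        mu[k] = rng.normal(mean, 1.0 / np.sqrt(prec))
    return mu, z

for sweep in range(2000):
    # within-temperature Gibbs updates
    for r in range(R):
        mus[r], zs[r] = gibbs_sweep(mus[r], zs[r], betas[r])
    # exchange step between adjacent temperatures
    for r in range(R - 1):
        dE = energy(mus[r], zs[r]) - energy(mus[r + 1], zs[r + 1])
        if np.log(rng.random()) < (betas[r] - betas[r + 1]) * dE:
            mus[[r, r + 1]] = mus[[r + 1, r]]
            zs[[r, r + 1]] = zs[[r + 1, r]]

print("posterior-replica means:", np.sort(mus[-1]))
```

In this sketch, swaps are proposed only between adjacent temperatures, and the acceptance probability depends only on the difference of inverse temperatures and of complete-data energies, so the exchange step is cheap compared with the Gibbs sweeps.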


Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Nagata, K., Watanabe, S. (2009). Design of Exchange Monte Carlo Method for Bayesian Learning in Normal Mixture Models. In: Köppen, M., Kasabov, N., Coghill, G. (eds) Advances in Neuro-Information Processing. ICONIP 2008. Lecture Notes in Computer Science, vol 5506. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02490-0_85

  • DOI: https://doi.org/10.1007/978-3-642-02490-0_85

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02489-4

  • Online ISBN: 978-3-642-02490-0

  • eBook Packages: Computer Science, Computer Science (R0)
