
Accelerated learning in Boltzmann Machines using mean field theory

  • Conference paper
  • In: Artificial Neural Networks — ICANN'97 (ICANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1327)

Abstract

The learning process in Boltzmann Machines is computationally intractable. We present a new approximate learning algorithm for Boltzmann Machines, based on mean field theory and the linear response theorem. The computational complexity of the algorithm is cubic in the number of neurons.
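
For orientation, the standard mean field and linear response relations for a network of ±1 units are sketched below; the notation (weights w_ij, thresholds θ_i, magnetisations m_i) is assumed for this illustration and not quoted from the paper.

```latex
% Naive mean field fixed point equations for the magnetisations m_i:
m_i = \tanh\Big(\sum_j w_{ij} m_j + \theta_i\Big)

% Linear response theorem: correlations equal the sensitivity of the
% magnetisations to the thresholds; within mean field the inverse is explicit:
C_{ij} \approx \chi_{ij} = \frac{\partial m_i}{\partial \theta_j},
\qquad
\left(\chi^{-1}\right)_{ij} = \frac{\delta_{ij}}{1 - m_i^2} - w_{ij}

% Solving this N x N linear system (a matrix inversion) is the O(N^3) step
% behind the cubic complexity quoted in the abstract.
```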

In the absence of hidden units, we show how the weights can be computed directly from the fixed point equation of the learning rules. The solutions obtained in this way are close to optimal and give a significant improvement over the naive mean field approach.
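
A minimal NumPy sketch of such a direct fixed-point solution for fully visible ±1 units is given below. The function name, the small regularisation eps, and the zeroing of the diagonal are choices made for this illustration, not details prescribed by the paper.

```python
import numpy as np

def mean_field_lr_weights(data, eps=1e-6):
    """Estimate weights W and thresholds theta of a fully visible Boltzmann
    Machine from +/-1 data via the mean-field / linear-response fixed point.

    data : array of shape (n_samples, n_units) with entries in {-1, +1}.
    """
    m = data.mean(axis=0)                  # clamped means <s_i>
    C = np.cov(data, rowvar=False)         # clamped covariances C_ij
    C += eps * np.eye(C.shape[0])          # regularise before inversion (assumption)

    C_inv = np.linalg.inv(C)               # the O(N^3) step
    A = np.diag(1.0 / (1.0 - m**2 + eps))  # diagonal term delta_ij / (1 - m_i^2)

    W = A - C_inv                          # linear-response relation solved for w_ij
    np.fill_diagonal(W, 0.0)               # no self-couplings (assumption)
    # mean field equation solved for theta_i:
    theta = np.arctanh(np.clip(m, -1 + eps, 1 - eps)) - W @ m
    return W, theta

if __name__ == "__main__":
    # Hypothetical data: random +/-1 patterns, only to exercise the code path.
    rng = np.random.default_rng(0)
    data = rng.choice([-1.0, 1.0], size=(5000, 10))
    W, theta = mean_field_lr_weights(data)
    print(W.shape, theta.shape)
```

With samples drawn from a Boltzmann distribution with known parameters, the recovered W and theta should approximate the generating values to the accuracy of the mean field approximation.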

Editor information

Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kappen, H.J., Rodríguez, F.B. (1997). Accelerated learning in Boltzmann Machines using mean field theory. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, JD. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020171

  • DOI: https://doi.org/10.1007/BFb0020171

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9

  • eBook Packages: Springer Book Archive
