
Parallelizing a Convergent Approximate Inference Method

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 6657)

Abstract

The ability to efficiently perform probabilistic inference tasks is critical to large-scale applications in statistics and artificial intelligence. Dramatic speedups can be achieved by appropriately mapping current inference algorithms to parallel frameworks. Parallel exact inference methods still suffer from exponential complexity in the worst case. Approximate inference methods have been parallelized, and good speedups have been reported. In this paper, we focus on a variant of the Belief Propagation algorithm that has better convergence properties and is provably convergent under certain conditions. We show that this method is amenable to coarse-grained parallelization and propose techniques to parallelize it optimally without sacrificing convergence. Experiments on a shared-memory system demonstrate near-ideal speedup with reasonable scalability.
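The coarse-grained parallelization the abstract refers to can be pictured as partitioning a graph's message updates across workers and synchronizing between sweeps. The sketch below illustrates that pattern for ordinary sum-product loopy belief propagation on a toy pairwise MRF; it is a hypothetical illustration, not the convergent variant studied in this paper, and the names `make_chain`, `update_block`, and `parallel_bp` are invented for the example.

```python
# Minimal sketch (illustrative, not the paper's algorithm): coarse-grained
# parallel sum-product belief propagation on a small pairwise MRF.
# The edge set is partitioned into blocks; each worker recomputes the
# messages for its block from the previous sweep's messages (Jacobi-style),
# and a barrier between sweeps keeps the updates consistent.
import numpy as np
from concurrent.futures import ThreadPoolExecutor


def make_chain(n_vars, n_states, rng):
    """Random positive unary/pairwise potentials for a chain MRF (toy data)."""
    unary = rng.random((n_vars, n_states)) + 0.1
    edges = [(i, i + 1) for i in range(n_vars - 1)]
    pairwise = {e: rng.random((n_states, n_states)) + 0.1 for e in edges}
    return unary, edges, pairwise


def update_block(edge_block, unary, edges, pairwise, msgs):
    """Recompute both directed messages on every edge in this block,
    reading only the old message dictionary (no locks needed)."""
    new = {}
    for (i, j) in edge_block:
        psi = pairwise[(i, j)]                    # indexed [x_i, x_j]
        for src, dst, mat in ((i, j, psi), (j, i, psi.T)):
            # Product of src's unary potential and all messages into src,
            # excluding the one coming back from dst.
            incoming = unary[src].copy()
            for (a, b) in edges:
                for (u, v) in ((a, b), (b, a)):
                    if v == src and u != dst:
                        incoming *= msgs[(u, v)]
            m = mat.T @ incoming                  # sum over src's states
            new[(src, dst)] = m / m.sum()         # normalize for stability
    return new


def parallel_bp(unary, edges, pairwise, n_workers=4, n_sweeps=50):
    n_states = unary.shape[1]
    msgs = {(u, v): np.ones(n_states) / n_states
            for (a, b) in edges for (u, v) in ((a, b), (b, a))}
    blocks = np.array_split(np.arange(len(edges)), n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        for _ in range(n_sweeps):
            futures = [pool.submit(update_block,
                                   [edges[k] for k in blk],
                                   unary, edges, pairwise, msgs)
                       for blk in blocks if len(blk)]
            merged = {}
            for f in futures:                     # barrier: wait for all blocks
                merged.update(f.result())
            msgs = merged
    beliefs = unary.copy()                        # node belief = unary * incoming
    for (u, v), m in msgs.items():
        beliefs[v] *= m
    return beliefs / beliefs.sum(axis=1, keepdims=True)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    unary, edges, pairwise = make_chain(n_vars=8, n_states=3, rng=rng)
    print(parallel_bp(unary, edges, pairwise))
```

Because the updates are synchronous (each worker reads only the previous sweep's messages), the partitions need no locking; the per-sweep barrier plays the role of the synchronization step that coarse-grained schemes rely on.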






Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Su, M., Thompson, E. (2011). Parallelizing a Convergent Approximate Inference Method. In: Butz, C., Lingras, P. (eds.) Advances in Artificial Intelligence. Canadian AI 2011. Lecture Notes in Computer Science (LNAI), vol. 6657. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21043-3_47


  • DOI: https://doi.org/10.1007/978-3-642-21043-3_47

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21042-6

  • Online ISBN: 978-3-642-21043-3

  • eBook Packages: Computer Science, Computer Science (R0)
