
Batch Bayesian Quadrature with Batch Updating Using Future Uncertainty Sampling

  • Conference paper
  • First Online:
Machine Learning, Optimization, and Data Science (LOD 2022)

Abstract

We consider the approximation of unknown or intractable integrals using quadrature when evaluating the integrand is very costly. This is a central problem both within machine learning and beyond, with applications including model averaging, (hyper-)parameter marginalization, and computing posterior predictive distributions. Recently, Batch Bayesian Quadrature has successfully combined the probabilistic integration techniques of Bayesian Quadrature with the parallelization techniques of Batch Bayesian Optimization, yielding improved performance over state-of-the-art Markov Chain Monte Carlo techniques, especially as parallelization increases. While selecting batches in Batch Bayesian Quadrature mitigates the costs associated with individual point selection, every point within every batch is nevertheless chosen serially, which prevents batch selection from reaching its full potential. We resolve this shortcoming: we develop a novel Batch Bayesian Quadrature method that allows points within a batch to be updated without incurring the costs traditionally associated with non-serial point selection. We show that our method efficiently reduces uncertainty, leads to lower-error estimates of the integrand, and therefore yields more numerically robust estimates of the integral. We demonstrate our method and support our findings using a synthetic test function from the Batch Bayesian Quadrature literature.
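For readers unfamiliar with the machinery the abstract builds on, the following is a minimal sketch of plain (non-batch) Bayesian Quadrature: a zero-mean Gaussian process with an RBF kernel is fit to a handful of integrand evaluations, and the integral against a Gaussian prior measure is read off in closed form from the kernel mean embeddings. The kernel, length-scale `ell`, measure width `sigma`, function `bq_estimate`, and the test integrand are illustrative assumptions, not taken from the paper, and the batch-updating and future-uncertainty-sampling components of the authors' method are not shown.

```python
# Minimal sketch of vanilla Bayesian Quadrature (not the paper's batch-updating
# method): a zero-mean GP with an RBF kernel is conditioned on a few integrand
# evaluations, and Z = \int f(x) N(x; 0, sigma^2) dx is estimated in closed form.
import numpy as np

def rbf(a, b, ell):
    """Unnormalised squared-exponential kernel k(a, b) = exp(-(a - b)^2 / (2 ell^2))."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * ell ** 2))

def bq_estimate(x, y, ell, sigma, jitter=1e-10):
    """Posterior mean and variance of the integral under a zero-mean GP prior on f."""
    K = rbf(x, x, ell) + jitter * np.eye(len(x))
    # Kernel mean embedding z_i = \int k(x, x_i) N(x; 0, sigma^2) dx (closed form for RBF).
    z = np.sqrt(ell**2 / (ell**2 + sigma**2)) * np.exp(-x**2 / (2.0 * (ell**2 + sigma**2)))
    # Prior variance of the integral: \int\int k(x, x') N(x) N(x') dx dx'.
    c = np.sqrt(ell**2 / (ell**2 + 2.0 * sigma**2))
    mean = z @ np.linalg.solve(K, y)
    var = c - z @ np.linalg.solve(K, z)
    return mean, var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda x: np.exp(-x**2) * np.cos(2.0 * x)   # hypothetical "expensive" integrand
    x = rng.uniform(-3.0, 3.0, size=8)              # small evaluation budget
    mean, var = bq_estimate(x, f(x), ell=0.6, sigma=1.0)
    print(f"BQ estimate: {mean:.4f} +/- {np.sqrt(max(var, 0.0)):.4f}")
```

Batch variants of this idea select several evaluation points at once; the paper's contribution is to let points within a batch be revised cheaply rather than fixed serially.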



Acknowledgments

We would like to thank Dr. Xingjie “Helen” Li, Dr. Hae-Soo Oh, Dr. Duan Chen, and Dr. Milind Khire for their generous insights and support. As always, K.H.S., J.M.S., E.M.S., and W.J.S., thank you and I l. y.

Author information


Corresponding author

Correspondence to Kelly Smalenberger.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Smalenberger, K., Smalenberger, M. (2023). Batch Bayesian Quadrature with Batch Updating Using Future Uncertainty Sampling. In: Nicosia, G., et al. Machine Learning, Optimization, and Data Science. LOD 2022. Lecture Notes in Computer Science, vol 13810. Springer, Cham. https://doi.org/10.1007/978-3-031-25599-1_13


  • DOI: https://doi.org/10.1007/978-3-031-25599-1_13

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-25598-4

  • Online ISBN: 978-3-031-25599-1

  • eBook Packages: Computer Science, Computer Science (R0)
