Gradient Descent for Gaussian Processes Variance Reduction

  • Conference paper
Structural, Syntactic, and Statistical Pattern Recognition (S+SSPR 2018)

Abstract

A key issue in Gaussian Process modeling is deciding on the locations where measurements are to be taken: a good set of observations yields a better model. The current state of the art selects such a set so as to minimize the posterior variance of the Gaussian Process by exploiting submodularity. We propose a Gradient Descent procedure that iteratively improves an initial set of observations so as to minimize the posterior variance directly. The performance of the technique is analyzed under different conditions by varying the number of measurement points, the dimensionality of the domain, and the hyperparameters of the Gaussian Process. Results show the applicability of the technique and the clear improvements that can be obtained under different settings.
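To make the idea concrete, the following Python sketch performs gradient descent on the average GP posterior variance with respect to the measurement locations. It illustrates the general approach only and is not the authors' algorithm: the squared-exponential kernel, the unit-square domain, the finite-difference gradient, and all function names and hyperparameters are assumptions made for this example.

import numpy as np

def rbf_kernel(A, B, lengthscale=0.2, signal_var=1.0):
    # Squared-exponential covariance between the rows of A and B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return signal_var * np.exp(-0.5 * sq / lengthscale**2)

def mean_posterior_variance(X_obs, X_test, noise_var=1e-2):
    # Average GP posterior variance over X_test given noisy observations at X_obs.
    K = rbf_kernel(X_obs, X_obs) + noise_var * np.eye(len(X_obs))
    Ks = rbf_kernel(X_obs, X_test)
    v = np.linalg.solve(K, Ks)                        # K^{-1} k(X_obs, X_test)
    prior_var = np.diag(rbf_kernel(X_test, X_test))
    return float(np.mean(prior_var - np.sum(Ks * v, axis=0)))

def descend(X_obs, X_test, steps=200, lr=0.05, eps=1e-5):
    # Finite-difference gradient descent on the coordinates of the measurement points.
    X = X_obs.copy()
    for _ in range(steps):
        base = mean_posterior_variance(X, X_test)
        grad = np.zeros_like(X)
        for i in range(X.shape[0]):
            for d in range(X.shape[1]):
                Xp = X.copy()
                Xp[i, d] += eps
                grad[i, d] = (mean_posterior_variance(Xp, X_test) - base) / eps
        X = np.clip(X - lr * grad, 0.0, 1.0)          # keep points inside the unit square
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = np.linspace(0.0, 1.0, 20)
    X_test = np.stack(np.meshgrid(grid, grid), axis=-1).reshape(-1, 2)
    X_init = rng.uniform(size=(5, 2))                 # 5 random initial measurement points
    print("initial mean posterior variance:", mean_posterior_variance(X_init, X_test))
    X_opt = descend(X_init, X_test)
    print("optimized mean posterior variance:", mean_posterior_variance(X_opt, X_test))

The objective here is the mean variance over a uniform test grid; which summary of the posterior variance is minimized, and how its gradient is computed, are modeling choices the abstract leaves open.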



Author information


Corresponding author

Correspondence to Lorenzo Bottarelli.



Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Bottarelli, L., Loog, M. (2018). Gradient Descent for Gaussian Processes Variance Reduction. In: Bai, X., Hancock, E., Ho, T., Wilson, R., Biggio, B., Robles-Kelly, A. (eds) Structural, Syntactic, and Statistical Pattern Recognition. S+SSPR 2018. Lecture Notes in Computer Science, vol. 11004. Springer, Cham. https://doi.org/10.1007/978-3-319-97785-0_16

  • DOI: https://doi.org/10.1007/978-3-319-97785-0_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-97784-3

  • Online ISBN: 978-3-319-97785-0

  • eBook Packages: Computer Science (R0)
