
Gaussian Process

Reference work entry in the Encyclopedia of Machine Learning

Synonyms

Expectation propagation; Kernels; Laplace estimate; Nonparametric Bayesian

Definition

Gaussian processes generalize multivariate Gaussian distributions over finite-dimensional vectors to infinite dimensions. Specifically, a Gaussian process is a stochastic process whose finite-dimensional marginal distributions are all Gaussian, hence the name. It thereby defines a distribution over functions: each draw from a Gaussian process is a function. Gaussian processes provide a principled, practical, and probabilistic approach to inference and learning in kernel machines.
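To make the finite-dimensional marginals concrete, here is a minimal sketch (not taken from this entry) that evaluates a zero-mean Gaussian process on a grid of inputs and draws sample functions from the resulting multivariate Gaussian; the squared-exponential kernel and its hyperparameters are illustrative assumptions.

    import numpy as np

    def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
        # Squared-exponential covariance: k(x, x') = variance * exp(-(x - x')^2 / (2 * lengthscale^2)).
        sqdist = (x1[:, None] - x2[None, :]) ** 2
        return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

    # Any finite set of inputs induces a multivariate Gaussian marginal with
    # covariance K; each row of `samples` is one draw from the process, i.e.,
    # one function evaluated on the grid.
    x = np.linspace(-5.0, 5.0, 100)
    K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))  # small jitter for numerical stability
    samples = np.random.multivariate_normal(np.zeros(len(x)), K, size=3)

Plotting each row of samples against x shows three smooth random functions, whose smoothness is governed by the lengthscale of the assumed kernel.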

Motivation and Background

Bayesian probabilistic approaches have many virtues, including the ability to incorporate prior knowledge and to link related sources of information. Typically, we are given a set of data points sampled from an underlying but unknown distribution, each consisting of an input x and an output y, such as the ones shown in Fig. 1a. The task is to learn a...
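For such input-output data, Gaussian process regression infers a posterior distribution over functions. The following sketch (an assumed setup reusing the rbf_kernel helper from above, with Gaussian observation noise and synthetic sine-wave data, not the entry's own example) computes the posterior mean and variance at test inputs.

    import numpy as np  # rbf_kernel is defined in the sketch above

    def gp_posterior(x_train, y_train, x_test, noise=0.1, lengthscale=1.0, variance=1.0):
        # Standard GP regression equations for a zero-mean prior:
        #   mean = K_* (K + noise^2 I)^{-1} y
        #   cov  = K_** - K_* (K + noise^2 I)^{-1} K_*^T
        K = rbf_kernel(x_train, x_train, lengthscale, variance) + noise ** 2 * np.eye(len(x_train))
        K_s = rbf_kernel(x_test, x_train, lengthscale, variance)
        K_ss = rbf_kernel(x_test, x_test, lengthscale, variance)
        mean = K_s @ np.linalg.solve(K, y_train)
        cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
        return mean, np.diag(cov)

    # Example usage on noisy observations of sin(x).
    x_train = np.linspace(-4.0, 4.0, 20)
    y_train = np.sin(x_train) + 0.1 * np.random.randn(20)
    x_test = np.linspace(-5.0, 5.0, 100)
    mean, var = gp_posterior(x_train, y_train, x_test)

The returned variance quantifies predictive uncertainty and, under this kernel, grows away from the training inputs.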




Copyright information

© 2011 Springer Science+Business Media, LLC

About this entry

Cite this entry

Quadrianto, N., Kersting, K., Xu, Z. (2011). Gaussian Process. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-30164-8_324
