On the Noise Model of Support Vector Machines Regression

  • Conference paper
  • Algorithmic Learning Theory (ALT 2000)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1968)


Abstract

Support Vector Machines Regression (SVMR) is a learning technique where the goodness of fit is measured not by the usual quadratic loss function (the mean square error), but by a different loss function called the ε-Insensitive Loss Function (ILF), which is similar to loss functions used in the field of robust statistics. The quadratic loss function is well justified under the assumption of Gaussian additive noise. However, the noise model underlying the choice of the ILF is not clear. In this paper the use of the ILF is justified under the assumption that the noise is additive and Gaussian, where the variance and mean of the Gaussian are random variables. The probability distributions for the variance and mean will be stated explicitly. While this work is presented in the framework of SVMR, it can be extended to justify non-quadratic loss functions in any Maximum Likelihood or Maximum A Posteriori approach. It applies not only to the ILF, but to a much broader class of loss functions.
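As a brief clarification (a sketch using standard definitions, not text from the paper itself): for a residual x and insensitivity parameter ε ≥ 0, the ε-insensitive loss is

\[
  |x|_{\varepsilon} =
  \begin{cases}
    0, & \text{if } |x| \le \varepsilon, \\
    |x| - \varepsilon, & \text{otherwise.}
  \end{cases}
\]

The noise model described in the abstract corresponds to a marginal noise density obtained by averaging a Gaussian over a random mean t and a random variance σ²,

\[
  P(x) = \int\!\!\int \mathcal{N}\!\left(x \mid t, \sigma^{2}\right) p(t)\, p\!\left(\sigma^{2}\right) dt\, d\sigma^{2},
\]

with p(t) and p(σ²) chosen, as the paper shows, so that −log P(x) equals |x|_ε up to an additive constant; the explicit forms of p(t) and p(σ²) are derived in the paper and are not reproduced here.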




Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pontil, M., Mukherjee, S., Girosi, F. (2000). On the Noise Model of Support Vector Machines Regression. In: Arimura, H., Jain, S., Sharma, A. (eds) Algorithmic Learning Theory. ALT 2000. Lecture Notes in Computer Science (LNAI), vol 1968. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-40992-0_24

  • DOI: https://doi.org/10.1007/3-540-40992-0_24

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-41237-3

  • Online ISBN: 978-3-540-40992-2

  • eBook Packages: Springer Book Archive
