
Part of the book series: Lecture Notes in Computer Science (LNISA, volume 4881)


Abstract

Support Vector Regression (SVR) is usually pursued using the ε-insensitive loss function; alternatively, the initial regression problem can be reduced to a properly defined classification one. In either case, slack variables have to be introduced in problems of practical interest, the usual choice being linear penalties on them. In this work we shall discuss the solution of an SVR problem by first recasting it as a classification problem and working with square penalties. Besides a general theoretical discussion, we shall derive some consequences of the coefficient structure of the resulting SVMs for regression problems, and we shall illustrate the procedure on standard benchmark problems as well as on a wind energy forecasting problem.
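
To make the reduction concrete, the following is a minimal sketch, not the authors' implementation: it builds the classical regression-to-classification construction by shifting each target up and down by ε, fits a linear classifier with a squared-hinge (square penalty) loss in the augmented (x, y) space, and recovers the regression estimate from the separating hyperplane. It assumes scikit-learn's LinearSVC; the function name svr_via_classification and the parameters epsilon and C are illustrative only.

```python
# Illustrative sketch only: a linear, squared-hinge stand-in for the
# square-penalty SVR-via-classification idea, not the paper's method.
import numpy as np
from sklearn.svm import LinearSVC

def svr_via_classification(X, y, epsilon=0.1, C=10.0):
    # Augmented classification sample in (x, y) space:
    # (x_i, y_i + eps) labelled +1, (x_i, y_i - eps) labelled -1.
    X_aug = np.vstack([np.column_stack([X, y + epsilon]),
                       np.column_stack([X, y - epsilon])])
    t = np.hstack([np.ones(len(y)), -np.ones(len(y))])

    # Squared hinge loss = square penalty on the slack variables.
    clf = LinearSVC(loss="squared_hinge", C=C, max_iter=10000)
    clf.fit(X_aug, t)

    w = clf.coef_.ravel()        # last component multiplies the target axis
    b = clf.intercept_[0]
    w_x, w_y = w[:-1], w[-1]

    # Hyperplane w_x . x + w_y * y + b = 0  =>  y(x) = -(w_x . x + b) / w_y
    def predict(X_new):
        return -(np.asarray(X_new) @ w_x + b) / w_y

    return predict

# Toy usage on a noisy 1-D sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
f = svr_via_classification(X, y, epsilon=0.1)
print(f(np.array([[0.5], [1.0]])))
```

In a kernel setting, the square penalty amounts to training a hard-margin machine on a kernel matrix whose diagonal is augmented by a term proportional to 1/C, which is how such models are often implemented in practice.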




Editor information

Hujun Yin, Peter Tino, Emilio Corchado, Will Byrne, Xin Yao


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Barbero, Á., López, J., Dorronsoro, J.R. (2007). Square Penalty Support Vector Regression. In: Yin, H., Tino, P., Corchado, E., Byrne, W., Yao, X. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2007. IDEAL 2007. Lecture Notes in Computer Science, vol 4881. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-77226-2_55


  • DOI: https://doi.org/10.1007/978-3-540-77226-2_55

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-77225-5

  • Online ISBN: 978-3-540-77226-2

  • eBook Packages: Computer Science, Computer Science (R0)
