
SVM Regularizer Models on RKHS vs. on R^m

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7389)

Abstract

There are two types of regularizers for SVMs. In the most popular model, the classification function is norm-regularized on a Reproducing Kernel Hilbert Space (RKHS); in another important model, the generalized support vector machine (GSVM), the coefficients of the classification function are norm-regularized on the Euclidean space R^m. In this paper, we analyze the differences between the two models in terms of computational stability, computational complexity, and the efficiency of Newton-type algorithms. Many typical loss functions are considered. The results show that the GSVM model has more advantages than the RKHS model, and experiments support our analysis.
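To make the comparison concrete, the two models can be written side by side. The following sketch uses standard notation that the abstract itself does not fix: training pairs (x_i, y_i), i = 1, ..., m, a kernel k with Gram matrix K_{ij} = k(x_i, x_j), a loss function L, and a trade-off parameter C.

    % RKHS model: the classifier f is penalized by its RKHS norm; with the
    % representer theorem f(x) = \sum_j \alpha_j k(x_j, x), it reduces to
    \min_{\alpha \in \mathbb{R}^m} \; \tfrac{1}{2}\,\alpha^{\top} K \alpha
        + C \sum_{i=1}^{m} L\bigl(y_i, (K\alpha)_i\bigr)

    % GSVM model: the coefficient vector \alpha is penalized directly on R^m:
    \min_{\alpha \in \mathbb{R}^m} \; \tfrac{1}{2}\,\|\alpha\|_2^2
        + C \sum_{i=1}^{m} L\bigl(y_i, (K\alpha)_i\bigr)

The models differ only in the quadratic term (α^T K α versus α^T α), but that term determines the conditioning of the problem and hence the behavior of the Newton-type solvers the abstract mentions. As an illustration, here is a minimal Python sketch of a semismooth Newton iteration for the GSVM model; the squared hinge loss and the RBF kernel are assumptions made here for illustration, not choices taken from the paper:

    import numpy as np

    def rbf_kernel(X, Z, gamma=0.5):
        """Gaussian (RBF) kernel matrix between the rows of X and of Z."""
        sq = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2.0 * X @ Z.T
        return np.exp(-gamma * np.maximum(sq, 0.0))

    def gsvm_newton(K, y, C=1.0, iters=50, tol=1e-8):
        """Semismooth Newton iteration for the GSVM model with squared hinge loss:
            min_a 0.5*||a||^2 + C * sum_i max(0, 1 - y_i*(K a)_i)^2,
        i.e. the coefficient vector a is norm-regularized on R^m."""
        m = K.shape[0]
        a = np.zeros(m)
        for _ in range(iters):
            r = 1.0 - y * (K @ a)      # margin residuals
            A = r > 0.0                # active set: margin-violating points
            grad = a - 2.0 * C * K[A].T @ (y[A] * r[A])
            if np.linalg.norm(grad) < tol:
                break
            H = np.eye(m) + 2.0 * C * K[A].T @ K[A]  # Hessian on the active set
            a -= np.linalg.solve(H, grad)
        return a

    # Usage on synthetic data with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    K = rbf_kernel(X, X)
    a = gsvm_newton(K, y)
    print("training accuracy:", np.mean(np.sign(K @ a) == y))

For the RKHS model the analogous Hessian would be K + 2C K_A^T K_A rather than I + 2C K_A^T K_A; the identity block in the GSVM Hessian bounds its smallest eigenvalue away from zero, which is one computational difference of the kind the paper analyzes.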

This work was supported by the NNSFC under Grant Nos. 61072144, 61179040, 61173089, and 11101322.





Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Dong, Y., Zhou, S. (2012). SVM Regularizer Models on RKHS vs. on R^m. In: Huang, D.-S., Jiang, C., Bevilacqua, V., Figueroa, J.C. (eds) Intelligent Computing Technology. ICIC 2012. Lecture Notes in Computer Science, vol 7389. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31588-6_14


  • DOI: https://doi.org/10.1007/978-3-642-31588-6_14

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-31587-9

  • Online ISBN: 978-3-642-31588-6

  • eBook Packages: Computer Science, Computer Science (R0)
