Designing Nonlinear Classifiers Through Minimizing VC Dimension Bound

  • Conference paper
Advances in Neural Networks – ISNN 2005

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3496)

Abstract

The VC dimension bound for the set of separating hyperplanes is evaluated by the ratio of the squared radius of the smallest enclosing sphere to the squared margin. Once a kernel and its parameters are chosen, this radius is fixed, so a hard-margin SVM minimizes the ratio by minimizing the squared 2-norm of the weight vector alone. In this paper, a bound on the squared radius in the feature space is derived; it depends on the scaling factor of the RBF kernel and on a bound for the squared radius in the input space. The squared 2-norm of the weight vector is expressed as a quadratic form, which yields a simple VC dimension bound for classification with the RBF kernel. Minimizing this bound leads to two constrained nonlinear programming problems, one for the linearly separable case and one for the nonlinearly separable case. Solving them designs a nonlinear classifier with the RBF kernel and determines the kernel's scaling factor simultaneously.
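For orientation, the classical radius-margin ingredients behind such a bound can be written out explicitly. The following is the standard formulation from statistical learning theory together with one illustrative way to bound the feature-space radius for the RBF kernel; the constants in the paper's own bound may differ.

```latex
\begin{align*}
% Vapnik's radius--margin bound: the VC dimension h of hyperplanes with
% margin \gamma on data inside a sphere of radius R in feature space obeys
h &\le \min\!\left(\left\lceil R^{2}/\gamma^{2}\right\rceil,\ n\right) + 1,
  \qquad \gamma = 1/\lVert w\rVert \text{ for the canonical hyperplane},\\[4pt]
% the squared weight norm is a quadratic form in the dual variables:
\lVert w\rVert^{2}
  &= \sum_{i,j}\alpha_{i}\alpha_{j}\,y_{i}y_{j}\,K(x_{i},x_{j}),\\[4pt]
% for the RBF kernel K(x,x') = \exp(-\lVert x-x'\rVert^{2}/\sigma^{2}),
% K(x,x) = 1, so every image \Phi(x) lies on the unit sphere, and if the
% inputs lie in a ball of radius R_X (hence \lVert x-x'\rVert \le 2R_X):
\lVert\Phi(x)-\Phi(x')\rVert^{2}
  &= 2\bigl(1-K(x,x')\bigr)
   \;\le\; 2\bigl(1-e^{-4R_{X}^{2}/\sigma^{2}}\bigr).
\end{align*}
```

The last inequality caps the feature-space radius as a function of the scaling factor σ and the input-space radius bound R_X, exactly the two quantities the abstract names.

A minimal computational sketch of the resulting joint optimization follows, assuming the illustrative radius bound above; the helper names (`rbf_gram`, `objective`, `margins`) are hypothetical, and the paper's actual programs for the two separability cases will differ in detail.

```python
# Illustrative sketch only -- not the paper's exact formulation.
# Jointly minimizes an assumed VC-style bound R^2(sigma) * ||w||^2 over
# dual variables alpha, bias b, and RBF width sigma, subject to
# hard-margin separation constraints (the linearly separable case).
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def rbf_gram(X, sigma):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / sigma^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / sigma ** 2)

def unpack(p, n):
    return p[:n], p[n], p[n + 1]            # alpha, b, sigma

def objective(p, X, y):
    alpha, _, sigma = unpack(p, len(y))
    K = rbf_gram(X, sigma)
    w2 = (alpha * y) @ K @ (alpha * y)       # ||w||^2 as a quadratic form
    rx2 = np.max(np.sum((X - X.mean(axis=0)) ** 2, axis=1))  # input radius^2
    r2 = 2.0 * (1.0 - np.exp(-4.0 * rx2 / sigma ** 2))  # assumed feature bound
    return r2 * w2

def margins(p, X, y):
    alpha, b, sigma = unpack(p, len(y))
    K = rbf_gram(X, sigma)
    return y * (K @ (alpha * y) + b)         # hard margin requires >= 1

# Toy separable data: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (15, 2)), rng.normal(2.0, 0.5, (15, 2))])
y = np.hstack([-np.ones(15), np.ones(15)])
n = len(y)

p0 = np.hstack([np.full(n, 0.5), 0.0, 2.0])
bounds = [(0.0, None)] * n + [(None, None), (0.1, 50.0)]
sep = NonlinearConstraint(lambda p: margins(p, X, y), 1.0, np.inf)
res = minimize(objective, p0, args=(X, y), method="SLSQP",
               bounds=bounds, constraints=[sep])
alpha, b, sigma = unpack(res.x, n)
print("sigma* =", sigma, " bound =", res.fun)
```

The separation constraints keep the dual variables away from the trivial zero solution, and σ enters the objective twice: through the radius bound and through the Gram matrix inside ‖w‖².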




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Xu, J. (2005). Designing Nonlinear Classifiers Through Minimizing VC Dimension Bound. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_144

  • DOI: https://doi.org/10.1007/11427391_144

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-25912-1

  • Online ISBN: 978-3-540-32065-4

  • eBook Packages: Computer Science, Computer Science (R0)
