Abstract
The VC dimension bound for the set of separating hyperplanes is determined by the ratio of the squared radius of the smallest enclosing sphere to the squared margin. Once a kernel and its parameters are chosen, the radius is fixed. In a hard-margin SVM, the ratio is therefore minimized by minimizing the squared 2-norm of the weight vector. In this paper, a bound on the squared radius in the feature space is derived, which depends on the scaling factor of the RBF kernel and a bound on the squared radius in the input space. The squared 2-norm of the weight vector is expressed as a quadratic form. A simple VC dimension bound with the RBF kernel is thus proposed for classification. By minimizing this bound, two constrained nonlinear programming problems are formulated, for the linearly and the nonlinearly separable cases. Solving them yields nonlinear classifiers with the RBF kernel while simultaneously determining the kernel's scaling factor.
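The two quantities the abstract combines can be sketched numerically. The snippet below is an illustrative sketch, not the paper's construction: `feature_space_radius_bound` bounds the feature-space radius using only the input-space diameter (for the RBF kernel every mapped point has unit norm, and feature-space distances grow monotonically with input distances), and `weight_norm_sq` evaluates the squared 2-norm of the weight vector as the quadratic form (α∘y)ᵀK(α∘y) in the dual variables. The function names and the particular bound chosen here are assumptions for illustration.

```python
import numpy as np

def feature_space_radius_bound(X, gamma):
    """Illustrative upper bound on the squared radius of the smallest
    sphere enclosing the RBF-mapped data.

    With k(x, y) = exp(-gamma * ||x - y||^2), every mapped point has
    unit norm, and the squared feature-space distance between two
    mapped points is 2 * (1 - exp(-gamma * ||x - y||^2)), increasing
    in the input-space distance.  A ball centered at any one mapped
    point with radius equal to the largest such distance encloses all
    points, and centering at the origin gives the trivial bound R <= 1.
    """
    # squared pairwise distances in input space
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    diam_sq = sq.max()  # squared input-space diameter
    return min(1.0, 2.0 * (1.0 - np.exp(-gamma * diam_sq)))

def weight_norm_sq(alpha, y, K):
    """||w||^2 as a quadratic form in the dual variables:
    (alpha * y)^T K (alpha * y), with K the kernel Gram matrix."""
    v = alpha * y
    return float(v @ K @ v)
```

Note that the radius bound shrinks toward 0 as the scaling factor `gamma` decreases and saturates at 1 as it grows, which is why the radius cannot be ignored when the kernel parameter itself is being optimized.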
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Xu, J. (2005). Designing Nonlinear Classifiers Through Minimizing VC Dimension Bound. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_144
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-25912-1
Online ISBN: 978-3-540-32065-4
eBook Packages: Computer Science (R0)