Abstract
The performance of the Support Vector Machine (SVM) is significantly affected by its model parameters. A commonly used parameter selection method for SVM, grid search (GS), is very time consuming. This paper introduces a Uniform Design (UD) and Support Vector Regression (SVR) method to reduce the computational cost of the traditional GS method: the error bounds of the SVM are computed only at nodes selected by the UD method, and an SVR model is then trained on these results. The values of the SVM error bound at the remaining nodes are estimated by the SVR function, and the optimal parameters are selected based on the estimates. Experiments on seven standard datasets show that the parameters selected by the proposed method yield test error rates similar to those obtained by the conventional GS method, while the computational cost is reduced from at most O(n^m) to O(n), where m is the number of parameters and n is the number of levels of each parameter.
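As a concrete illustration, here is a minimal Python sketch of the workflow the abstract describes. It is not the authors' implementation: the dataset, the parameter ranges, the use of 5-fold cross-validation error in place of the SVM error bound, and a Sobol sequence standing in for a Uniform Design table are all illustrative assumptions.

```python
# Sketch of the UD + SVR parameter-selection idea: evaluate the expensive
# error estimate only at a few design nodes, fit an SVR surrogate to those
# results, then pick the parameters that minimise the surrogate's prediction.
import numpy as np
from scipy.stats import qmc
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC, SVR

X, y = load_breast_cancer(return_X_y=True)

# Search ranges in log2 space: C in [2^-5, 2^15], gamma in [2^-15, 2^3]
# (assumed ranges, chosen for illustration only).
lo, hi = np.array([-5.0, -15.0]), np.array([15.0, 3.0])

# Step 1: select a small set of design nodes (Sobol points as a UD stand-in).
nodes = qmc.scale(qmc.Sobol(d=2, seed=0).random(16), lo, hi)

# Step 2: compute the expensive error estimate only at the design nodes.
errors = np.array([
    1.0 - cross_val_score(SVC(C=2.0**c, gamma=2.0**g), X, y, cv=5).mean()
    for c, g in nodes
])

# Step 3: train an SVR surrogate on the (node, error) pairs.
surrogate = SVR(kernel="rbf", C=10.0, gamma="scale").fit(nodes, errors)

# Step 4: estimate the error over the full grid and pick the minimiser.
cc, gg = np.meshgrid(np.linspace(lo[0], hi[0], 41), np.linspace(lo[1], hi[1], 41))
grid = np.column_stack([cc.ravel(), gg.ravel()])
best = grid[np.argmin(surrogate.predict(grid))]
print(f"selected C=2^{best[0]:.1f}, gamma=2^{best[1]:.1f}")
```

Note that the expensive step (step 2) runs only at the 16 design nodes rather than at all 41 × 41 grid points, which is the source of the cost reduction the abstract claims.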
© 2004 Springer-Verlag Berlin Heidelberg

Cite this paper
Zhu, Y., Li, C., Zhang, Y. (2004). A Practical Parameters Selection Method for SVM. In: Yin, F.L., Wang, J., Guo, C. (eds.) Advances in Neural Networks – ISNN 2004. Lecture Notes in Computer Science, vol. 3173. Springer, Berlin, Heidelberg.
DOI: https://doi.org/10.1007/978-3-540-28647-9_86
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22841-7
Online ISBN: 978-3-540-28647-9