A Practical Parameters Selection Method for SVM

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3173)

Abstract

The performance of a Support Vector Machine (SVM) is significantly affected by its model parameters. A commonly used parameter-selection method for SVM, grid search (GS), is very time consuming. This paper introduces Uniform Design (UD) and Support Vector Regression (SVR) to reduce the computational cost of the traditional GS method: the error bound of the SVM is computed only at nodes selected by the UD method, and an SVR model is then trained on these results. The trained SVR function estimates the error bound at the remaining nodes, and the optimized parameters are selected from the estimated values. Experiments on seven standard datasets show that the parameters selected by the proposed method yield test error rates similar to those obtained by the conventional GS method, while the computational cost is reduced from at most O(n^m) to O(n), where m is the number of parameters and n is the number of levels of each parameter.
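The selection pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `mock_cv_error` stands in for the real error-bound computation, kernel ridge regression stands in for the SVR surrogate, and the good-lattice-point construction is only one simple way to build a UD point set.

```python
import numpy as np

# Hypothetical sketch of the paper's idea:
# (1) lay a uniform-design (UD) point set over the parameter grid,
# (2) evaluate the (here: mock) error surface only at those UD nodes,
# (3) fit a cheap surrogate (RBF kernel ridge, standing in for the SVR step),
# (4) pick the grid node with the smallest predicted error.

def mock_cv_error(log_c, log_g):
    """Toy error surface with its minimum near (log C, log gamma) = (2, -1)."""
    return (log_c - 2.0) ** 2 + (log_g + 1.0) ** 2

n = 15                                  # levels per parameter
levels = np.linspace(-3.0, 5.0, n)      # candidate log C / log gamma values
grid = np.array([(c, g) for c in levels for g in levels])  # n^m = 225 nodes

# Good-lattice-point construction: a simple UD table with a generator h
# coprime to n; only n of the n^2 grid nodes are actually evaluated.
h = 11
ud = np.array([(levels[i], levels[(i * h) % n]) for i in range(n)])
y = np.array([mock_cv_error(c, g) for c, g in ud])

def rbf(a, b, s=2.0):
    """Gaussian RBF kernel matrix between two point sets."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * s * s))

# Centered kernel-ridge fit: predictions revert to the mean far from data.
ybar = y.mean()
alpha = np.linalg.solve(rbf(ud, ud) + 1e-6 * np.eye(n), y - ybar)
pred = ybar + rbf(grid, ud) @ alpha     # estimated error at all 225 nodes
best = grid[np.argmin(pred)]            # selected (log C, log gamma)
print(best)
```

On this 15 x 15 grid only 15 of the 225 nodes are evaluated directly, which is the O(n^m)-to-O(n) reduction the abstract claims; the surrogate fills in the rest.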




Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

Cite this paper

Zhu, Y., Li, C., Zhang, Y. (2004). A Practical Parameters Selection Method for SVM. In: Yin, FL., Wang, J., Guo, C. (eds) Advances in Neural Networks – ISNN 2004. ISNN 2004. Lecture Notes in Computer Science, vol 3173. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28647-9_86

  • DOI: https://doi.org/10.1007/978-3-540-28647-9_86

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22841-7

  • Online ISBN: 978-3-540-28647-9

  • eBook Packages: Springer Book Archive
