
Research on parameter selection method for support vector machines

  • Published in: Applied Intelligence

Abstract

The kernel parameter and the penalty parameter C are the main factors affecting the learning performance of a support vector machine (SVM). However, existing methods for selecting the kernel parameter and the penalty parameter C have notable shortcomings: they achieve low accuracy when classifying multi-category samples, and some even discard part of the samples during training, compromising the integrity of the experimental data. In contrast, this paper improves the selection of the SVM kernel parameter and penalty parameter in two ways. First, it obtains the kernel parameter value by maximizing the separation interval between the samples. Second, it refines the estimate of generalization ability by accounting for the influence of non-boundary support vectors on the stability of the SVM. The method takes all training samples into account, applies to most sample types, has low initialization requirements, and achieves high test accuracy. It is verified on several UCI data sets and on a facial image recognition task; the experimental results show that the method is feasible, effective, and stable.
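The abstract's full optimization procedure is not reproduced on this page. As an illustration only, the following pure-Python sketch captures the flavor of the first idea, choosing an RBF kernel width by maximizing a simple class-separation score in the induced feature space. The function names (`rbf`, `separation_score`, `select_gamma`) and the toy data are hypothetical stand-ins, not the authors' method.

```python
import math

def rbf(x, y, gamma):
    # RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * d2)

def separation_score(X, y, gamma):
    # Mean within-class similarity minus mean between-class similarity.
    # Larger values indicate better class separation in the feature
    # space induced by this gamma.
    within = between = 0.0
    nw = nb = 0
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            k = rbf(X[i], X[j], gamma)
            if y[i] == y[j]:
                within += k
                nw += 1
            else:
                between += k
                nb += 1
    return within / nw - between / nb

def select_gamma(X, y, candidates):
    # Pick the candidate gamma with the highest separation score.
    return max(candidates, key=lambda g: separation_score(X, y, g))

# Toy two-class data (illustrative values only).
X = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3),
     (2.0, 2.0), (2.1, 1.8), (1.9, 2.2)]
y = [0, 0, 0, 1, 1, 1]
best = select_gamma(X, y, [0.01, 0.1, 1.0, 10.0])
```

A very small gamma makes all points look similar and a very large one makes even same-class points look dissimilar, so the score peaks at an intermediate width; selecting the peak avoids an exhaustive grid search over a held-out validation set.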




Author information


Correspondence to Ling Sun.


About this article


Cite this article

Sun, L., Bao, J., Chen, Y. et al. Research on parameter selection method for support vector machines. Appl Intell 48, 331–342 (2018). https://doi.org/10.1007/s10489-017-0975-3

