
A fast leave-one-out cross-validation for SVM-like family

  • Original Article
  • Published: Neural Computing and Applications

Abstract

Leave-one-out cross-validation is an important parameter-selection strategy for the SVM-like family, including SVM and SVR. However, its high computational complexity restricts its practical applicability. In this paper, aiming at practical application, a fast leave-one-out cross-validation method based on an adjustment factor is proposed. The method focuses on practicability for the SVM-like family, in which the decision function can be expressed as an explicit dot product over training-sample pairs. The proposed method selects parameters faster than the original leave-one-out cross-validation while achieving the same or comparable learning performance. Simulation results demonstrate the effectiveness and speedup of the proposed method.
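The abstract does not spell out the adjustment factor itself, but the general flavor of fast exact leave-one-out for this model family can be illustrated with a well-known closed-form identity for kernel ridge regression, a member of the SVM-like family whose decision function is an explicit kernel expansion over the training samples. With C = (K + λI)⁻¹ and α = Cy, the exact LOO residual at sample i is αᵢ/Cᵢᵢ, so one fit replaces n retrainings. The sketch below (function names and the KRR setting are illustrative assumptions, not the authors' method) compares this against the naive n-retraining loop:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between row-sample arrays X and Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def loo_residuals_naive(K, y, lam):
    """Naive LOO for kernel ridge regression: n separate retrainings."""
    n = len(y)
    errs = np.empty(n)
    for i in range(n):
        idx = np.delete(np.arange(n), i)
        # Retrain on the n-1 remaining samples
        alpha = np.linalg.solve(K[np.ix_(idx, idx)] + lam * np.eye(n - 1), y[idx])
        errs[i] = y[i] - K[i, idx] @ alpha  # residual at the held-out point
    return errs

def loo_residuals_fast(K, y, lam):
    """Closed-form LOO residuals from a single fit:
    e_i = alpha_i / C_ii, where C = (K + lam*I)^{-1} and alpha = C @ y."""
    C = np.linalg.inv(K + lam * np.eye(len(y)))
    alpha = C @ y
    return alpha / np.diag(C)
```

On a random regression sample the two routines agree exactly, but the fast version costs one matrix inverse per hyperparameter candidate instead of n linear solves, which is what makes an exhaustive search over (gamma, lam) affordable.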



Acknowledgments

This work is supported by the National Natural Science Foundation of China (Nos. 61202311, 61300151), the Natural Science Foundation of Jiangsu Province (Grant BK201221834), the Fundamental Research Funds for the Central Universities (No. JUDCF13031), the 2013 Postgraduate Students Creative Research Fund of Jiangsu Province (No. CXLX13_748), and the Ministry of Education Research of Social Sciences Youth funded projects (14YJCZH206).

Author information

Corresponding author

Correspondence to Jingxiang Zhang.


About this article


Cite this article

Zhang, J., Wang, S. A fast leave-one-out cross-validation for SVM-like family. Neural Comput & Applic 27, 1717–1730 (2016). https://doi.org/10.1007/s00521-015-1970-4

