Abstract
Leave-one-out cross-validation is an important parameter-selection strategy for the SVM-like family, including SVM and SVR. However, its high computational complexity restricts its practical applicability. In this paper, aiming at practical application, a fast leave-one-out cross-validation method based on an adjustment factor is proposed, targeted at members of the SVM-like family whose decision function can be expressed through explicit dot products of training sample pairs. The proposed method selects parameters faster than the original leave-one-out cross-validation while achieving the same or comparable learning performance. Simulation results demonstrate the effectiveness and speedup of the proposed method.
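For context, the baseline the paper speeds up can be sketched as follows. This is a minimal, naive leave-one-out grid search (not the paper's adjustment-factor method), shown here for kernel ridge regression as a stand-in for the SVM-like family; the dataset, the `gamma` grid, and the regularization constant `lam` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Gram matrix of the Gaussian (RBF) kernel: exp(-gamma * ||x - z||^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_error(X, y, gamma, lam):
    # Naive leave-one-out: retrain on n-1 samples, test on the held-out one.
    # This n-fold retraining is the cost the paper's fast method avoids.
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        K = rbf_kernel(X[mask], X[mask], gamma)
        alpha = np.linalg.solve(K + lam * np.eye(n - 1), y[mask])
        k_i = rbf_kernel(X[i:i + 1], X[mask], gamma)
        errs.append(float((k_i @ alpha - y[i]) ** 2))
    return float(np.mean(errs))

# Toy regression data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

# Parameter selection: pick the kernel width with the lowest LOO error
gammas = [0.01, 0.1, 1.0, 10.0]
best = min(gammas, key=lambda g: loo_error(X, y, g, lam=1e-2))
print("best gamma:", best)
```

Each candidate parameter requires n full retrainings here (O(n) model fits per grid point), which is why exact LOO is rarely used directly for parameter selection on large datasets.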






Acknowledgments
This work is supported by the National Natural Science Foundation of China (Nos. 61202311 and 61300151), the Natural Science Foundation of Jiangsu Province (Grant BK201221834), the Fundamental Research Funds for the Central Universities (No. JUDCF13031), the 2013 Postgraduate Students' Creative Research Fund of Jiangsu Province (No. CXLX13_748), and the Ministry of Education Social Sciences Research Youth Fund Project (No. 14YJCZH206).
Cite this article
Zhang, J., Wang, S. A fast leave-one-out cross-validation for SVM-like family. Neural Comput & Applic 27, 1717–1730 (2016). https://doi.org/10.1007/s00521-015-1970-4