Abstract
We propose a computationally efficient method for cross-validation of Support Vector Regression (SVR) by generalizing the decremental algorithm of SVR. Incremental and decremental algorithms of Support Vector Machines (SVM) [2, 8, 9] efficiently update a trained SVM model when a single data point is added to or removed from the training set. The computational cost of leave-one-out cross-validation can be reduced using the decremental algorithm. However, to perform leave-m-out cross-validation (m > 1), the decremental algorithm must be applied repeatedly, once for each removed data point. In this paper, we extend the decremental algorithm of SVR [8, 9] so that several data points can be removed more efficiently. Experimental results indicate that the proposed approach reduces the computational cost. In particular, we observed that the number of breakpoints, which dominates the computational cost of the involved path-following, was reduced from \({\mathcal O}(m)\) to \({\mathcal O}(\sqrt{m})\).
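To make the cost structure concrete, the sketch below enumerates leave-m-out folds and counts how many single-point decremental updates the naive baseline needs. The helper names are hypothetical (the paper's actual decremental SVR update is not reproduced here); the point is only that removing each held-out point one at a time costs m path-following runs per fold, which is the \({\mathcal O}(m)\) breakpoint count the proposed batch removal improves on.

```python
# Hypothetical sketch of the naive leave-m-out CV loop (helper names are
# illustrative; the decremental SVR update itself is not implemented here).

def leave_m_out_folds(n, m):
    """Partition indices 0..n-1 into consecutive held-out groups of size m
    (the last fold may be smaller if m does not divide n)."""
    idx = list(range(n))
    return [idx[i:i + m] for i in range(0, n, m)]

def naive_decremental_steps(n, m):
    """Naive baseline: for each fold, apply the single-point decremental
    algorithm once per held-out point, i.e. O(m) path-following runs per
    fold and n runs in total over all folds."""
    return sum(len(fold) for fold in leave_m_out_folds(n, m))

# Example: n = 100, m = 5 gives 20 folds; the naive approach runs the
# single-point decremental update 5 times per fold, 100 times overall.
print(naive_decremental_steps(100, 5))  # -> 100
```

The paper's contribution is to replace the inner per-point loop with a single batch removal whose observed breakpoint count grows like \(\sqrt{m}\) rather than m.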
References
Bach, F. R., Heckerman, D. and Horvitz, E., “Considering Cost Asymmetry in Learning Classifiers,” Journal of Machine Learning Research, 7, pp. 1713–1741, 2006.
Cauwenberghs, G. and Poggio, T., “Incremental and Decremental Support Vector Machine Learning,” Advances in Neural Information Processing Systems 13 (Leen, T. K., Dietterich, T. G. and Tresp, V. ed.), MIT Press, pp. 409–415, 2000.
Chang, C.-C. and Lin, C.-J., LIBSVM: A Library for Support Vector Machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
Cristianini, N. and Shawe-Taylor, J., An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, Cambridge University Press, 2000.
Gal, T., Postoptimal Analysis, Parametric Programming, and Related Topics, Walter de Gruyter, 1995.
Hastie, T., Rosset, S., Tibshirani, R. and Zhu, J., “The Entire Regularization Path for the Support Vector Machine,” Journal of Machine Learning Research, 5, pp. 1391–1415, 2004.
Laskov, P., Gehl, C., Krüger, S. and Müller, K., “Incremental Support Vector Learning: Analysis, Implementation and Applications,” Journal of Machine Learning Research, 7, pp. 1909–1936, 2006.
Ma, J. and Theiler, J., “Accurate Online Support Vector Regression,” Neural Computation, 15, 11, pp. 2683–2703, 2003.
Martin, M., “On-line Support Vector Machines for Function Approximation,” Tech. Rep. LSI-02-11-R, Software Department, Universitat Politecnica de Catalunya, Spain, 2002.
Schott, J. R., Matrix Analysis for Statistics, Wiley-Interscience, 2005.
Takeuchi, I., Nomura, K. and Kanamori, T., “Nonparametric Conditional Density Estimation using Piecewise-linear Solution Path of Kernel Quantile Regression,” Neural Computation, 21, 2, pp. 533–559, 2009.
Vapnik, V., The Nature of Statistical Learning Theory, Springer-Verlag, 1995.
Wang, W., Yeung, D.-Y. and Lochovsky, F. H., “A New Solution Path Algorithm in Support Vector Regression,” IEEE Transactions on Neural Networks, 19, 10, pp. 1753–1767, 2008.
Karasuyama, M., Takeuchi, I. & Nakano, R. Efficient Leave-m-out Cross-Validation of Support Vector Regression by Generalizing Decremental Algorithm. New Gener. Comput. 27, 307–318 (2009). https://doi.org/10.1007/s00354-008-0067-3