Efficient Leave-m-out Cross-Validation of Support Vector Regression by Generalizing Decremental Algorithm

Abstract

We propose a computationally efficient method for cross-validation of Support Vector Regression (SVR) by generalizing the decremental algorithm of SVR. Incremental and decremental algorithms for Support Vector Machines (SVM) [2, 8, 9] efficiently update a trained SVM model when a single data point is added to or removed from the training set. The computational cost of leave-one-out cross-validation can be reduced using the decremental algorithm. When performing leave-m-out cross-validation (m > 1), however, the decremental algorithm must be applied repeatedly, once for each held-out data point. In this paper, we extend the decremental algorithm of SVR [8, 9] so that several data points can be removed at once more efficiently. Experimental results indicate that the proposed approach reduces the computational cost. In particular, we observed that the number of breakpoints, which dominates the computational cost of the involved path-following, was reduced from \({\mathcal O}(m)\) to \({\mathcal O}(\sqrt{m})\).
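For orientation, the sketch below shows only the naive baseline that this work improves upon: leave-m-out cross-validation of SVR in which the model is retrained from scratch on every fold. It is not the authors' decremental path-following algorithm; the scikit-learn SVR estimator, the synthetic data, and the fold size m = 20 are assumptions made for illustration. The proposed method replaces the per-fold retraining step with a generalized decremental update of the full-data solution that removes all m held-out points at once.

```python
# Minimal sketch (assumed setup, not the paper's algorithm): naive
# leave-m-out cross-validation of SVR with full retraining per fold.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.svm import SVR

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

m = 20  # fold size: leave m points out per validation round
errors = []
for train_idx, val_idx in KFold(n_splits=len(X) // m,
                                shuffle=True, random_state=0).split(X):
    # The expensive step: retraining on the remaining points for every fold.
    # The paper's generalized decremental algorithm avoids this retraining.
    model = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X[train_idx], y[train_idx])
    errors.append(np.mean((model.predict(X[val_idx]) - y[val_idx]) ** 2))

print("leave-%d-out CV MSE: %.4f" % (m, np.mean(errors)))
```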

References

  1. Bach, F. R., Heckerman, D. and Horvitz, E., “Considering Cost Asymmetry in Learning Classifiers,” Journal of Machine Learning Research, 7, pp. 1713–1741, 2006.

  2. Cauwenberghs, G. and Poggio, T., “Incremental and Decremental Support Vector Machine Learning,” Advances in Neural Information Processing Systems 13 (Leen, T. K., Dietterich, T. G. and Tresp, V. ed.), MIT Press, pp. 409–415, 2000.

  3. Chang, C.-C. and Lin, C.-J., LIBSVM: A Library for Support Vector Machines, 2001, Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm

  4. Cristianini, N. and Shawe-Taylor, J., An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, Cambridge University Press, 2000.

  5. Gal, T., Postoptimal Analysis, Parametric Programming, and Related Topics, Walter de Gruyter, 1995.

  6. Hastie, T., Rosset, S., Tibshirani, R. and Zhu, J., “The Entire Regularization Path for the Support Vector Machine,” Journal of Machine Learning Research, 5, pp. 1391–1415, 2004.

  7. Laskov, P., Gehl, C., Krüger, S. and Müller, K.-R., “Incremental Support Vector Learning: Analysis, Implementation and Applications,” Journal of Machine Learning Research, 7, pp. 1909–1936, 2006.

  8. Ma, J. and Theiler, J., “Accurate Online Support Vector Regression,” Neural Computation, 15, 11, pp. 2683–2703, 2003.

  9. Martin, M., “On-line Support Vector Machines for Function Approximation,” Tech. Rep. LSI-02-11-R, Software Department, Universitat Politecnica de Catalunya, Spain, 2002.

  10. Schott, J. R., Matrix Analysis for Statistics, Wiley-Interscience, 2005.

  11. Takeuchi, I., Nomura, K. and Kanamori, T., “Nonparametric Conditional Density Estimation using Piecewise-linear Solution Path of Kernel Quantile Regression,” Neural Computation, 21, 2, pp. 533–559, 2009.

  12. Vapnik, V., The Nature of Statistical Learning Theory, Springer-Verlag, 1995.

  13. Wang, W., Yeung, D.-Y. and Lochovsky, F. H., “A New Solution Path Algorithm in Support Vector Regression,” IEEE Transactions on Neural Networks, 19, 10, pp. 1753–1767, 2008.

Author information

Corresponding author

Correspondence to Masayuki Karasuyama.

About this article

Cite this article

Karasuyama, M., Takeuchi, I. & Nakano, R. Efficient Leave-m-out Cross-Validation of Support Vector Regression by Generalizing Decremental Algorithm. New Gener. Comput. 27, 307–318 (2009). https://doi.org/10.1007/s00354-008-0067-3
