
Model Combination for Support Vector Regression via Regularization Path

  • Conference paper
PRICAI 2012: Trends in Artificial Intelligence (PRICAI 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7458)


Abstract

In order to improve the generalization performance of support vector regression (SVR), we propose a novel model combination method for SVR based on the regularization path. First, we construct the initial candidate model set using the regularization path, whose inherent piecewise linearity makes the construction easy and effective. Then, we carefully select the models for combination from the initial model set through the improved Occam's Window method and the input-dependent strategy. Finally, we combine the selected models using Bayesian model averaging. Experimental results on benchmark data sets show that our combination method has a significant advantage over model selection methods based on generalized cross validation (GCV) and the Bayesian information criterion (BIC). The results also verify that the improved Occam's Window method and the input-dependent strategy can enhance the predictive performance of the combination model.
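The three-step procedure the abstract describes (candidate set from the regularization path, Occam's Window pruning, Bayesian model averaging) can be sketched as follows. This is a minimal illustration under assumptions, not the paper's exact algorithm: a finite grid of C values stands in for the piecewise-linear regularization path, the weighting uses a simple Gaussian-error validation likelihood, and the window width `occam_cutoff` is an invented parameter for illustration.

```python
# Sketch: model combination for SVR via a regularization-path-style grid,
# Occam's Window pruning, and Bayesian model averaging.
# The C grid, weighting scheme, and `occam_cutoff` are illustrative assumptions.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, size=200)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# Step 1: candidate models along (a grid approximation of) the regularization path.
C_grid = np.logspace(-2, 2, 9)
models = [SVR(C=C).fit(X_tr, y_tr) for C in C_grid]

# Step 2: posterior-like weights from validation error (Gaussian likelihood),
# then Occam's Window: discard models far from the best one.
mse = np.array([np.mean((m.predict(X_val) - y_val) ** 2) for m in models])
log_w = -0.5 * len(y_val) * np.log(mse)   # log-likelihood up to constants
log_w -= log_w.max()                      # shift so the best model has weight 1
weights = np.exp(log_w)
occam_cutoff = 0.05                       # assumed window width
weights = np.where(weights >= occam_cutoff * weights.max(), weights, 0.0)
weights /= weights.sum()

# Step 3: Bayesian-model-averaged prediction over the surviving models.
def bma_predict(X_new):
    preds = np.stack([m.predict(X_new) for m in models])
    return weights @ preds

print(bma_predict(np.array([[0.5]])))
```

The input-dependent strategy from the abstract would go further, adjusting which models (or weights) are used per query point; the sketch above uses a single global weight vector for simplicity.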





Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wang, M., Liao, S. (2012). Model Combination for Support Vector Regression via Regularization Path. In: Anthony, P., Ishizuka, M., Lukose, D. (eds) PRICAI 2012: Trends in Artificial Intelligence. PRICAI 2012. Lecture Notes in Computer Science, vol. 7458. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32695-0_57


  • DOI: https://doi.org/10.1007/978-3-642-32695-0_57

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-32694-3

  • Online ISBN: 978-3-642-32695-0

  • eBook Packages: Computer Science (R0)
