
Theoretically Optimal Parameter Choices for Support Vector Regression Machines with Noisy Input


Abstract

Within the evidence framework, the regularized linear regression model can be interpreted as the corresponding MAP (maximum a posteriori) problem, and the general dependency relationships that the optimal parameters of this model should obey under noisy input are then derived. The support vector regression machines Huber-SVR and Norm-r r-SVR are two typical instances of this model, and their optimal parameter choices receive particular attention. It turns out that, in the presence of typical Gaussian input noise, the parameter μ in Huber-SVR depends linearly on the input noise level, while the parameter r in r-SVR is inversely proportional to it. These theoretical results should help practitioners apply kernel-based regression techniques effectively in applications.
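The two scaling rules stated above can be sketched in code. The snippet below is a minimal illustration only: the proportionality constants `k` and `c` are hypothetical placeholders (the paper derives the exact dependencies), and the noise level is assumed to be estimated from pilot-fit residuals.

```python
import numpy as np

def choose_huber_mu(sigma, k=1.0):
    """Huber-SVR parameter mu scales linearly with the input-noise level sigma.
    k is a hypothetical placeholder constant, not the paper's derived value."""
    return k * sigma

def choose_r(sigma, c=1.0):
    """Norm-r r-SVR parameter r is inversely proportional to sigma.
    c is likewise a placeholder constant."""
    return c / sigma

# Estimate the noise level from residuals of a pilot fit, then set parameters.
rng = np.random.default_rng(0)
residuals = rng.normal(scale=0.5, size=1000)  # stand-in for pilot-fit residuals
sigma_hat = residuals.std()

mu = choose_huber_mu(sigma_hat)  # grows linearly with the noise level
r = choose_r(sigma_hat)          # shrinks as the noise level grows
```

Doubling the estimated noise level doubles μ and halves r, which is the qualitative behaviour the abstract describes.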



Author information

Correspondence to Wang Shitong.

About this article

Cite this article

Shitong, W., Jiagang, Z., Chung, F. et al. Theoretically Optimal Parameter Choices for Support Vector Regression Machines with Noisy Input. Soft Comput 9, 732–741 (2005). https://doi.org/10.1007/s00500-004-406-3
