Abstract
In this paper, a novel regression algorithm, coined flexible support vector regression, is proposed. We first model the insensitive zone of classic support vector regression by its up- and down-bound functions and then introduce a generalized parametric insensitive loss function (GPILF). Based on GPILF, we propose an optimization criterion under which the unknown regressor and its up- and down-bound functions can be found simultaneously by solving a single quadratic programming problem. Experimental results on several publicly available benchmark data sets and on time series prediction show the feasibility and effectiveness of the proposed method.
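To illustrate the idea of an insensitive zone bounded by two functions, the following is a minimal sketch of a parametric insensitive loss of the kind described above. It assumes NumPy; the function name `gpilf` and the argument names are illustrative, not the authors' notation, and the exact GPILF formulation in the paper may differ.

```python
import numpy as np

def gpilf(y, f, up, down):
    """Illustrative parametric insensitive loss.

    y    : target values
    f    : regressor outputs f(x)
    up   : nonnegative values of the up-bound function (zone width above f)
    down : nonnegative values of the down-bound function (zone width below f)

    The loss is zero inside the (possibly asymmetric) tube
    [f - down, f + up] and grows linearly outside it.
    """
    above = np.maximum(0.0, y - (f + up))    # target lies above the tube
    below = np.maximum(0.0, (f - down) - y)  # target lies below the tube
    return above + below
```

In classic epsilon-insensitive SVR, `up` and `down` are both the constant epsilon; here they are functions of the input, which is what allows the tube to adapt its width across the domain.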





Acknowledgments
The authors would like to thank the anonymous reviewers for their critical and constructive comments and suggestions. This work was supported by the Scientific Foundation of Jiangsu Province (BK2010339) and the Natural Science Fund for Colleges and Universities in Jiangsu Province (10KJD580001).
Cite this article
Chen, X., Yang, J. & Liang, J. A flexible support vector machine for regression. Neural Comput & Applic 21, 2005–2013 (2012). https://doi.org/10.1007/s00521-011-0623-5