Abstract
Support Vector Regression (SVR) solves regression problems based on the concept of the Support Vector Machine (SVM). In this paper, we introduce a novel SVR model in which the training samples, comprising both inputs and outputs, are treated as random variables with known or unknown distribution functions. The occurrence of each constraint is governed by a probability density function, which helps to obtain a maximum margin and achieve robustness. The optimal regression hyperplane is obtained by solving a quadratic optimization problem. The proposed method is illustrated by several experiments on artificial data sets and real-world benchmark data sets.
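For orientation, the sketch below shows the standard ε-SVR dual quadratic program (linear kernel) of the kind the abstract refers to, solved with an off-the-shelf QP solver. It is a minimal baseline illustration only, not the paper's probabilistic-constraint formulation; the function name `fit_linear_svr`, the use of cvxopt, and all parameter values are assumptions made for the example.

```python
# Minimal sketch: standard epsilon-SVR dual QP with a linear kernel.
# This is NOT the stochastic SVR of the paper; it only illustrates that
# the regression hyperplane comes from a quadratic optimization problem.
import numpy as np
from cvxopt import matrix, solvers

def fit_linear_svr(X, y, C=10.0, eps=0.1):
    n = X.shape[0]
    K = X @ X.T                                      # linear kernel Gram matrix
    # Decision variables z = [alpha; alpha_star], length 2n.
    P = np.block([[K, -K], [-K, K]])
    q = np.hstack([eps - y, eps + y])
    G = np.vstack([-np.eye(2 * n), np.eye(2 * n)])   # box constraints 0 <= z <= C
    h = np.hstack([np.zeros(2 * n), C * np.ones(2 * n)])
    A = np.hstack([np.ones(n), -np.ones(n)])[None, :]  # sum(alpha - alpha*) = 0
    b = np.zeros(1)

    solvers.options['show_progress'] = False
    sol = solvers.qp(matrix(P), matrix(q), matrix(G), matrix(h),
                     matrix(A), matrix(b))
    z = np.array(sol['x']).ravel()
    coef = z[:n] - z[n:]                             # alpha - alpha_star
    w = X.T @ coef                                   # primal weight vector
    # Bias from a sample whose alpha is strictly inside (0, C), else a fallback.
    free = np.where((z[:n] > 1e-6) & (z[:n] < C - 1e-6))[0]
    b0 = y[free[0]] - eps - X[free[0]] @ w if free.size else np.mean(y - X @ w)
    return w, b0

# Usage: recover a noisy linear trend.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = 2.0 * X.ravel() + 0.05 * rng.standard_normal(50)
w, b0 = fit_linear_svr(X, y)
print(w, b0)
```

The stochastic model in the paper replaces the hard inequality constraints of this dual with probabilistic (chance) constraints on the random training samples, but the resulting problem is still solved as a quadratic program.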
