Abstract
Support vector machine (SVM) is a supervised machine learning method that can be used for both classification and regression. In this paper, we introduce new models of SVM and support vector regression in which each training sample, consisting of inputs and outputs, is treated as a random variable with a known or unknown probability distribution. These new models require the mathematical expectation of each training sample; when the distributions are unknown, we estimate the expectations with nonparametric statistical methods. Moreover, the constraints are required to hold with prescribed probabilities, which helps to obtain a maximum margin and to achieve robustness. We obtain the optimal separating hyperplane and the optimal regression hyperplane by solving quadratic optimization problems. Finally, the proposed methods are illustrated by several experiments.
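As a rough illustration of the idea described in the abstract (not the authors' exact formulation), the sketch below treats each training sample as a random variable observed through repeated draws, replaces it by its sample mean as a nonparametric estimate of the expectation, and then fits an ordinary linear soft-margin SVM on those expectations by subgradient descent on the regularized hinge loss. All names, parameters, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each "training sample" is a random variable observed several times;
# when its distribution is unknown, estimate E[x_i] by the sample mean
# (a simple nonparametric plug-in estimate, as the paper suggests).
n_samples, n_draws, dim = 40, 25, 2
centers = np.vstack([rng.normal(+2.0, 1.0, (n_samples // 2, dim)),
                     rng.normal(-2.0, 1.0, (n_samples // 2, dim))])
y = np.array([+1] * (n_samples // 2) + [-1] * (n_samples // 2))

# draws[i] holds the repeated observations of random sample i
draws = centers[:, None, :] + rng.normal(0.0, 0.5, (n_samples, n_draws, dim))
X = draws.mean(axis=1)  # plug-in expectations E[x_i]

def train_svm(X, y, C=1.0, lr=0.01, epochs=500):
    """Linear soft-margin SVM on the expectations, via subgradient
    descent on (1/2)||w||^2 + C * sum(hinge losses)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1  # samples violating the margin
        grad_w = w - C * (y[mask, None] * X[mask]).sum(axis=0)
        grad_b = -C * y[mask].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

w, b = train_svm(X, y)
accuracy = np.mean(np.sign(X @ w + b) == y)
```

The paper's actual models additionally impose probabilistic (chance) constraints on the margin conditions and solve the resulting quadratic programs exactly; this sketch only shows the expectation-based preprocessing step combined with a standard SVM.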






Cite this article
Abaszade, M., Effati, S. Stochastic Support Vector Machine for Classifying and Regression of Random Variables. Neural Process Lett 48, 1–29 (2018). https://doi.org/10.1007/s11063-017-9697-0