
Stochastic support vector regression with probabilistic constraints


Abstract

Support Vector Regression (SVR) solves regression problems based on the concept of the Support Vector Machine (SVM). In this paper, we introduce a novel SVR model in which the training samples, consisting of inputs and outputs, are treated as random variables with known or unknown distribution functions. Each constraint is associated with a probability density function governing its occurrence, which helps to obtain the maximum margin and to achieve robustness. The optimal regression hyperplane is obtained by solving a quadratic optimization problem. The proposed method is illustrated by several experiments on both artificial data sets and real-world benchmark data sets.
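
For intuition, the following is a minimal sketch (not the paper's exact formulation) of how such probabilistic constraints are commonly written. It starts from the standard ε-insensitive SVR primal with weight vector w, bias b, slack variables ξ_i, ξ_i*, tube width ε and penalty C; the confidence levels κ_i, κ_i* are illustrative symbols assumed here for exposition.

% Standard epsilon-insensitive SVR primal (deterministic training samples)
\begin{aligned}
\min_{w,\,b,\,\xi,\,\xi^{*}} \quad & \tfrac{1}{2}\|w\|^{2} + C \sum_{i=1}^{n} (\xi_i + \xi_i^{*}) \\
\text{s.t.} \quad & y_i - (w^{\top}x_i + b) \le \varepsilon + \xi_i, \\
& (w^{\top}x_i + b) - y_i \le \varepsilon + \xi_i^{*}, \qquad \xi_i,\ \xi_i^{*} \ge 0.
\end{aligned}

% When the samples (x_i, y_i) are random variables, each margin constraint is
% instead required to hold with a prescribed probability (a chance constraint):
\Pr\!\left[\, y_i - (w^{\top}x_i + b) \le \varepsilon + \xi_i \,\right] \ge \kappa_i,
\qquad
\Pr\!\left[\, (w^{\top}x_i + b) - y_i \le \varepsilon + \xi_i^{*} \,\right] \ge \kappa_i^{*}.

Under distributional assumptions such as known means and variances, chance constraints of this kind can typically be rewritten as deterministic convex constraints, which is consistent with the abstract's statement that the optimal regression hyperplane is obtained by solving a quadratic optimization problem.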



Author information


Corresponding author

Correspondence to Sohrab Effati.

About this article


Cite this article

Abaszade, M., Effati, S. Stochastic support vector regression with probabilistic constraints. Appl Intell 48, 243–256 (2018). https://doi.org/10.1007/s10489-017-0964-6
