
An improved robust and sparse twin support vector regression via linear programming

  • Foundations
  • Published:
Soft Computing

Abstract

Twin support vector regression (TSVR) was proposed recently as a novel regressor that seeks a pair of nonparallel planes, i.e., the \(\epsilon \)-insensitive up- and down-bound functions, by solving two related SVM-type problems. Although TSVR performs well compared with conventional methods such as SVR, it suffers from the following issues: (1) it lacks model complexity control and thus may incur overfitting and suboptimal solutions; (2) it needs to solve a pair of quadratic programming problems, which are relatively complex to implement; (3) it is sensitive to outliers; and (4) its solution is not sparse. To address these problems, we propose in this paper a novel regression algorithm termed robust and sparse twin support vector regression. The central idea is to first reformulate TSVR as a convex problem by introducing a regularization term and then derive a linear programming (LP) formulation that is not only simple but also allows robustness and sparseness. Instead of solving the resulting LP problem in the primal, we present a Newton algorithm with Armijo step-size to solve the corresponding exact exterior penalty problem. Experimental results on several publicly available benchmark data sets show the feasibility and effectiveness of the proposed method.
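To make the LP idea concrete, the sketch below casts a single 1-norm-regularized \(\epsilon \)-insensitive regression problem as a linear program and solves it with an off-the-shelf solver. This is an illustrative simplification, not the paper's exact twin formulation: the function name, the regularization weight `lam`, and the use of `scipy.optimize.linprog` are assumptions for demonstration. The 1-norm objective is what yields sparse weights, and the linear (rather than squared) loss on violations tempers the influence of outliers.

```python
import numpy as np
from scipy.optimize import linprog

def l1_eps_regression(X, y, eps=0.1, lam=0.1):
    """Sparse epsilon-insensitive linear regression as a single LP.

    Illustrative sketch (not the paper's exact TSVR formulation):
        min  lam * ||w||_1 + sum(xi)
        s.t. |X w + b - y| <= eps + xi,  xi >= 0
    Each free variable is split into nonnegative parts so that
    linprog's default bounds (0, inf) apply to every variable.
    """
    m, n = X.shape
    # variable order: w+ (n), w- (n), b+ (1), b- (1), xi (m)
    c = np.concatenate([lam * np.ones(2 * n), [0.0, 0.0], np.ones(m)])
    ones = np.ones((m, 1))
    I = np.eye(m)
    A_ub = np.vstack([
        np.hstack([ X, -X,  ones, -ones, -I]),  #  Xw + b - y <= eps + xi
        np.hstack([-X,  X, -ones,  ones, -I]),  # -(Xw + b - y) <= eps + xi
    ])
    b_ub = np.concatenate([y + eps, -y + eps])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
    z = res.x
    w = z[:n] - z[n:2 * n]
    b = z[2 * n] - z[2 * n + 1]
    return w, b
```

On data generated from a weight vector with inactive features, the 1-norm penalty drives the corresponding coefficients to (near) zero, which is the sparseness the abstract refers to.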
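The abstract's final step, solving the LP through its exact exterior penalty with a Newton-Armijo iteration, follows the line of work by Mangasarian cited in the references. The sketch below shows only the generic pattern, on a toy penalty of the form \(f(x)=\tfrac{1}{2}\Vert (Ax-b)_+\Vert ^2+\mu c^{\top }x\): the plus-function penalty is piecewise quadratic, so a generalized Hessian (a 0/1 diagonal selecting the violated constraints) stands in for the ordinary Hessian, and Armijo backtracking guarantees descent. The function name, the ridge term `delta`, and all parameter values are illustrative assumptions, not the paper's.

```python
import numpy as np

def newton_armijo_penalty(A, b, c, x0, mu=1e-3, delta=1e-8,
                          sigma=1e-4, tol=1e-8, max_iter=100):
    """Minimize f(x) = 0.5*||(A x - b)_+||^2 + mu * c.T x by a
    generalized-Newton method with an Armijo step size.
    Illustrative sketch; the paper's exact formulation may differ."""
    x = np.asarray(x0, dtype=float).copy()
    n = x.size

    def f(v):
        p = np.maximum(A @ v - b, 0.0)
        return 0.5 * p @ p + mu * (c @ v)

    for _ in range(max_iter):
        p = np.maximum(A @ x - b, 0.0)
        g = A.T @ p + mu * c                        # gradient of f at x
        if np.linalg.norm(g) < tol:
            break
        active = (p > 0).astype(float)              # violated constraints
        H = (A.T * active) @ A + delta * np.eye(n)  # generalized Hessian + ridge
        d = np.linalg.solve(H, -g)                  # Newton direction
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + sigma * t * slope and t > 1e-12:
            t *= 0.5                                # Armijo backtracking
        x = x + t * d
    return x
```

The small ridge `delta` keeps the generalized Hessian invertible when few constraints are active; as the penalty parameter tightens, the minimizer approaches a solution of the underlying LP.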


References

  • Boyd S, Vandenberghe L (2004) Convex optimization. Cambridge University Press, Cambridge

  • Burges CJC (1998) A tutorial on support vector machines for pattern recognition. Data Min Knowl Discov 2(2):121–167


  • Chang CC, Lin CJ (2001) LIBSVM: a library for support vector machines. Available from http://www.csie.ntu.edu.tw/~cjlin

  • Chen X, Yang J, Ye Q, Liang J (2011) Recursive projection twin support vector machine via within-class variance minimization. Pattern Recognit 44:2643–2655


  • Chen X, Yang J, Liang J, Ye Q (2012) Smooth twin support vector regression. Neural Comput Appl 21:505–513


  • Chen X, Yang J, Liang J, Ye Q (2012) Recursive robust least squares support vector regression based on maximum correntropy criterion. Neurocomputing 97:63–73


  • Collobert R, Bengio S (2001) SVMTorch: support vector machines for large-scale regression problems. J Mach Learn Res 2:143–160

  • Ding C, Zhou D, He X, Zha H (2006) R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization. In: Proceedings of international conference on machine learning

  • Alevras D, Padberg MW (2001) Linear optimization and extensions: problems and solutions. Springer, Berlin

  • Fung GM, Mangasarian OL (2004) A feature selection newton method for support vector machine classification. Comput Optim Appl 28:185–202


  • Fung G, Mangasarian OL (2001) Proximal support vector machine classifiers. In: Provost F, Srikant R (eds) Knowledge discovery and data mining. ACM, San Francisco, pp 77–86

  • Hiriart-Urruty JB, Strodiot JJ, Nguyen VH (1984) Generalized Hessian matrix and second-order optimality conditions for problems with C1,1 data. Appl Math Optim 11:43–56

  • Jayadeva, Khemchandani R, Chandra S (2007) Twin support vector machines for pattern classification. IEEE Trans Pattern Anal Mach Intell 29(5):905–910


  • Joachims T (1999) Making large-scale SVM learning practical. In: Advances in kernel methods: support vector learning. MIT Press, Cambridge

  • Kumar MA, Gopal M (2009) Least squares twin support vector machines for pattern classification. Expert Syst Appl 36:7535–7543


  • Kwak N (2008) Principal component analysis based on L1-norm maximization. IEEE Trans Pattern Anal Mach Intell 30:1672–1680


  • Mangasarian OL, Meyer RR (1979) Nonlinear perturbation of linear programs. SIAM J Control Optim 17:745–752

  • Mangasarian OL, Wild EW (2006) Multisurface proximal support vector classification via generalized eigenvalues. IEEE Trans Pattern Anal Mach Intell 28(1):69–74

  • Mangasarian OL (2006) Exact 1-norm support vector machines via unconstrained convex differentiable minimization. J Mach Learn Res 7:1517–1530

  • Murphy PM, Aha DW (1992) UCI repository of machine learning databases

  • Nocedal J, Wright SJ (2006) Numerical optimization, 2nd edn. Springer, New York

  • Osuna E, Freund R, Girosi F (1997) An improved training algorithm for support vector machines. In: Proceedings of neural networks for signal processing, vol VII, New York

  • Peng X (2010) TSVR: an efficient twin support vector machine for regression. Neural Netw 23:365–372


  • Platt JC (1998) Fast training of support vector machines using sequential minimal optimization. In: Advances in kernel methods: support vector learning. MIT Press, Cambridge

  • Schölkopf B, Smola AJ (2002) Learning with kernels. MIT Press, Cambridge

  • Suykens JAK, Vandewalle J (1999) Least squares support vector machine classifiers. Neural Process Lett 9:293–300


  • Tibshirani R (1996) Regression shrinkage and selection via the lasso. J Royal Stat Soc B 58:267–288


  • Tikhonov AN, Arsenin VY (1977) Solutions of ill-posed problems. Wiley, New York

  • Vapnik VN (1995) The nature of statistical learning theory. Springer, New York


Acknowledgments

The authors would like to thank the editor and the anonymous reviewers for their critical and constructive comments and suggestions. This work was partially supported by the National Natural Science Foundation of China (Grant Nos. 61203244, 51108209 and 61272211) and the Ministry of Transportation of China (Grant No. 2013-364-836-900).

Author information


Corresponding author

Correspondence to Xiaobo Chen.

Additional information

Communicated by A. Castiglione.


About this article


Cite this article

Chen, X., Yang, J. & Chen, L. An improved robust and sparse twin support vector regression via linear programming. Soft Comput 18, 2335–2348 (2014). https://doi.org/10.1007/s00500-014-1342-5
