Robust truncated L\(_2\)-norm twin support vector machine

  • Original Article
  • Published:
International Journal of Machine Learning and Cybernetics

Abstract

This paper proposes a new robust truncated L\(_2\)-norm twin support vector machine (T\(^2\)SVM), in which the truncated L\(_2\)-norm measures the empirical risk so that the classifiers remain robust when many outliers are present. Chance constraints are also employed to specify the false positive and false negative error rates. T\(^2\)SVM thus amounts to a pair of chance-constrained, nonconvex, nonsmooth optimization problems. To solve these difficult problems, we propose an efficient iterative method for T\(^2\)SVM based on difference of convex functions (DC) programming and DC algorithms (DCA). Experiments on benchmark and artificial data sets demonstrate the advantages of T\(^2\)SVM in terms of robustness and generalization performance.
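
As a rough sketch of the mechanism (not the paper's exact formulation): a truncated L\(_2\)-norm loss caps each sample's contribution at a level \(s\), i.e. \(\min(\xi^2, s)\), and admits the DC decomposition \(\min(\xi^2, s) = \xi^2 - \max(\xi^2 - s, 0)\), so DCA can repeatedly linearize the concave part and solve the remaining convex subproblem. The Python sketch below illustrates this on a deliberately simplified robust least-squares problem; the truncation level s, the least-squares setting, the warm start, and the function name are assumptions made purely for illustration.

import numpy as np

# Simplified illustration (assumed setup, not the paper's T^2SVM with chance
# constraints): DCA for the robust least-squares problem
#     min_w  sum_i min( (y_i - x_i^T w)^2, s ),
# using the DC split  min(r^2, s) = r^2 - max(r^2 - s, 0).  Each iteration
# linearizes the concave part via a subgradient v and solves the remaining
# convex subproblem  min_w ||y - X w||^2 - v^T w  in closed form.
def dca_truncated_l2(X, y, s=1.0, n_iter=30):
    d = X.shape[1]
    XtX = X.T @ X + 1e-8 * np.eye(d)            # small ridge term for invertibility
    w = np.linalg.solve(XtX, X.T @ y)           # warm start: ordinary least squares
    for _ in range(n_iter):
        r = y - X @ w                           # residuals at the current iterate
        active = r ** 2 > s                     # samples whose loss is truncated
        # Subgradient of h(w) = sum_i max(r_i^2 - s, 0): each active sample
        # contributes the gradient of r_i^2, namely -2 * r_i * x_i.
        v = -2.0 * (X[active].T @ r[active]) if active.any() else np.zeros(d)
        # Stationarity of the convex subproblem: 2 X^T X w = 2 X^T y + v.
        w = np.linalg.solve(XtX, X.T @ y + 0.5 * v)
    return w

# Toy usage: a clean linear model plus a handful of gross outliers.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.05 * rng.normal(size=200)
y[:10] += 20.0                                  # outliers that a plain L2 fit would chase
print(dca_truncated_l2(X, y, s=1.0))            # recovers roughly [1, -2, 0.5]

In T\(^2\)SVM itself the convex subproblems additionally carry the chance constraints on the false positive and false negative error rates, so they are handled by a convex solver at each DCA iteration rather than in closed form.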

Acknowledgements

This research is partially supported by the National Natural Science Foundation of China (Grant No. 11871128), the Natural Science Foundation of Chongqing (Grant No. cstc2019jcyj-msxmX0282), the Scientific and Technological Research Program of Chongqing Municipal Education Commission (Grant No. KJQN201900531), the Program of Chongqing Innovation Research Group Project in University (CXQT20014), and the Innovation Project for Scientific Research of Postgraduate of Chongqing (CYS20247).

Author information

Corresponding author

Correspondence to Guoquan Li.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Yang, L., Li, G., Wu, Z. et al. Robust truncated L\(_2\)-norm twin support vector machine. Int. J. Mach. Learn. & Cyber. 12, 3415–3436 (2021). https://doi.org/10.1007/s13042-021-01368-8

  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s13042-021-01368-8
