
Joint rescaled asymmetric least squared nonparallel support vector machine with a stochastic quasi-Newton based algorithm

Published in: Applied Intelligence

Abstract

A large body of literature has shown that the standard support vector machine (SVM), induced by the hinge loss function, is unstable under re-sampling (feature noise) and sensitive to outliers (label noise). In this paper, we propose a novel rescaled asymmetric least squared geometric twin parametric-margin SVM (RaLS-GTPSVM) for binary classification problems with feature noise and label noise. The proposed method is consistent and obtains two nonparallel boundary hyperplanes by solving a single SVM-type optimization problem. We discuss several theoretical properties of the RaLS-GTPSVM, such as outlier insensitivity and Fisher consistency. A stochastic quasi-Newton (SQN)-based half-quadratic (HQ) algorithm is implemented to solve the RaLS-GTPSVM, and the convergence of the SQN-based HQ procedure is proved. To demonstrate the superiority of the proposed RaLS-GTPSVM, we conduct extensive numerical studies on both synthetic and real datasets, comparing it with several well-known existing methods.
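The abstract describes an asymmetric least squares loss made outlier-insensitive by a bounded rescaling, optimized with a half-quadratic (HQ) scheme. The following is a minimal illustrative sketch, not the paper's exact formulation: the loss form `als_loss`, the asymmetry parameter `p`, the rescaling parameter `eta`, and the helper `hq_fit` are all hypothetical choices for illustration, and a deterministic linear solve stands in for the paper's stochastic quasi-Newton inner solver.

```python
import numpy as np

def als_loss(u, p=0.7):
    """Asymmetric least squares (expectile-type) loss: positive residuals
    are weighted by p, negative ones by (1 - p). Illustrative form only."""
    return np.where(u >= 0, p * u**2, (1 - p) * u**2)

def rescaled_als_loss(u, p=0.7, eta=0.5):
    """Welsch/correntropy-style rescaling bounds the loss, so a single
    huge residual (outlier) has limited influence on the objective."""
    return 1.0 - np.exp(-eta * als_loss(u, p))

def hq_fit(X, y, p=0.7, eta=0.5, lam=1e-2, n_iter=20):
    """Half-quadratic fitting of a linear classifier: alternate between
    (1) closed-form HQ weights w_i = exp(-eta * als_i), and
    (2) a weighted, ridge-regularized least-squares solve.
    (The paper instead uses a stochastic quasi-Newton inner step.)"""
    n, d = X.shape
    w = np.ones(n)                           # HQ auxiliary weights
    beta = np.zeros(d)
    for _ in range(n_iter):
        u = 1.0 - y * (X @ beta)             # margin residuals
        a = np.where(u >= 0, p, 1.0 - p)     # asymmetric weights
        q = w * a                            # combined per-sample weights
        A = (X * q[:, None]).T @ X + lam * np.eye(d)
        b = X.T @ (q * y)                    # uses y_i^2 = 1
        beta = np.linalg.solve(A, b)
        w = np.exp(-eta * als_loss(1.0 - y * (X @ beta), p))
    return beta
```

Because the rescaled loss is bounded above, the HQ weight of a grossly misclassified point decays toward zero, which is the mechanism behind the outlier insensitivity claimed in the abstract.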




Acknowledgements

This work is supported by the National Natural Science Foundation of China [Grant No. 11671059].

Author information


Corresponding author

Correspondence to Hu Yang.

Ethics declarations

Conflict of Interests

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Qi, K., Yang, H. Joint rescaled asymmetric least squared nonparallel support vector machine with a stochastic quasi-Newton based algorithm. Appl Intell 52, 14387–14405 (2022). https://doi.org/10.1007/s10489-022-03183-2
