
Functional iterative approaches for solving support vector classification problems based on generalized Huber loss

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

The classical support vector machine (SVM) and its twin variant, the twin support vector machine (TWSVM), use the hinge loss, which grows linearly with the error, whereas the least squares SVM (LSSVM) and the least squares twin support vector machine (LSTSVM) use the L2-norm of the error, which grows quadratically. The robust Huber loss function can be viewed as a generalization of the hinge loss and the L2-norm loss: it behaves like the quadratic L2-norm loss for errors within a specified distance and like the linear hinge loss beyond it. This paper proposes three functional iterative approaches based on a generalized Huber loss function for solving support vector classification problems: one in the spirit of SVM, the generalized Huber support vector machine, and two in the spirit of TWSVM, namely the generalized Huber twin support vector machine and its regularized counterpart. The proposed approaches find the solutions iteratively and thereby eliminate the need to solve the quadratic programming problems (QPPs) required by SVM and TWSVM. Their main advantages are twofold: first, the robust Huber loss gives better generalization and lower sensitivity to noise and outliers than the quadratic loss; second, the functional iterative scheme removes the need for a QPP solver and makes the proposed approaches faster. The efficacy of the proposed approaches is established by numerical experiments on several real-world datasets and comparison with related methods, viz. SVM, TWSVM, LSSVM and LSTSVM. The classification results are convincing.
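To make the shape of such a loss concrete, the snippet below is a minimal sketch of one common way to write a Huber-style loss over the hinge margin error: quadratic for small errors (L2-like, as in LSSVM/LSTSVM) and linear beyond a threshold (hinge-like, as in SVM/TWSVM). The function name, the threshold `delta` and the scaling are illustrative assumptions, not the paper's exact generalized Huber formulation.

```python
import numpy as np

def huber_margin_loss(y, f, delta=1.0):
    """Illustrative Huber-style classification loss (not the paper's exact form).

    y     : labels in {-1, +1}
    f     : real-valued decision values
    delta : transition point between quadratic and linear growth
    """
    e = np.maximum(0.0, 1.0 - y * f)        # hinge-style margin error
    quadratic = 0.5 * e ** 2                # quadratic region for small errors
    linear = delta * e - 0.5 * delta ** 2   # linear region for large errors
    return np.where(e <= delta, quadratic, linear)

# Example: small margin violations are penalized quadratically,
# large violations only linearly, which limits the influence of outliers.
y = np.array([+1, +1, -1, -1])
f = np.array([0.9, -2.0, -0.5, 3.0])
print(huber_margin_loss(y, f, delta=1.0))
```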





Author information

Correspondence to Deepak Gupta.


Cite this article

Borah, P., Gupta, D. Functional iterative approaches for solving support vector classification problems based on generalized Huber loss. Neural Comput & Applic 32, 9245–9265 (2020). https://doi.org/10.1007/s00521-019-04436-x

