
Reductive and effective discriminative information-based nonparallel support vector machine


Abstract

In this paper, to improve the performance of the discriminative information-based nonparallel support vector machine (DINPSVM), we propose a novel algorithm called the reductive and effective discriminative information-based nonparallel support vector machine (REDINPSVM). First, we introduce a regularization term to implement the structural risk minimization principle, which lies at the core of statistical learning theory; this modification enhances the generalization ability of the classifier. Second, we apply the k-nearest neighbor method to eliminate redundant constraints, which reduces the time complexity. Finally, to accelerate the computation, we adopt the least squares technique, so that training reduces to solving two systems of linear equations. Comprehensive experimental results on twenty-three UCI benchmark datasets and six image datasets demonstrate the validity of the proposed method.
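To make the three ingredients concrete, here is a minimal, self-contained sketch in Python. It assumes an LSTSVM-style pair of nonparallel hyperplanes with an added Tikhonov regularization term, and it stands in for the paper's constraint reduction with a simple rule that keeps only those opposite-class points appearing among the k nearest neighbors of the current class. The names knn_reduce, fit_planes, predict and the parameters c1, c2, reg, k are illustrative only; this is not the authors' exact REDINPSVM formulation.

```python
import numpy as np

def knn_reduce(own, other, k=5):
    """Keep only those 'other'-class points that appear among the k nearest
    neighbours of at least one 'own'-class point; the remaining constraints
    are treated as redundant and dropped (a simplified stand-in for the
    paper's KNN-based constraint reduction)."""
    k = min(k, other.shape[0])
    dists = np.linalg.norm(own[:, None, :] - other[None, :, :], axis=2)
    nearest = np.argsort(dists, axis=1)[:, :k]            # k nearest 'other' indices per 'own' point
    return other[np.unique(nearest)]

def fit_planes(A, B, c1=1.0, c2=1.0, reg=0.1, k=5):
    """Fit one regularized least-squares hyperplane per class (LSTSVM-style)."""
    B_red = knn_reduce(A, B, k)                           # reduced constraints for plane 1
    A_red = knn_reduce(B, A, k)                           # reduced constraints for plane 2

    def plane(own, other, c):
        H = np.hstack([own, np.ones((own.shape[0], 1))])      # [X_own   e]
        G = np.hstack([other, np.ones((other.shape[0], 1))])  # [X_other e]
        e = np.ones(other.shape[0])
        # Regularized least-squares solution: one linear system per plane,
        #   u = -c (H'H + c G'G + reg*I)^{-1} G'e,   with u = [w; b]
        M = H.T @ H + c * (G.T @ G) + reg * np.eye(H.shape[1])
        u = -c * np.linalg.solve(M, G.T @ e)
        return u[:-1], u[-1]

    return plane(A, B_red, c1), plane(B, A_red, c2)

def predict(X, plane1, plane2):
    """Assign each sample to the class whose hyperplane is nearer."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal([0.0, 0.0], 0.5, (60, 2))              # class +1
    B = rng.normal([3.0, 3.0], 0.5, (60, 2))              # class -1
    p1, p2 = fit_planes(A, B)
    labels = np.r_[np.ones(60), -np.ones(60)]
    acc = np.mean(predict(np.vstack([A, B]), p1, p2) == labels)
    print(f"training accuracy: {acc:.2f}")
```

Because each plane is obtained from a single regularized linear system rather than a quadratic program, the training cost is dominated by solving two small linear systems, and the nearest-neighbor filter further shrinks the constraint matrix before those systems are formed.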


Notes

  1. The prior data distribution information refers to the information obtained from the spatial distribution of the known data in the feature space.

  2. The prior discriminant information refers to the information obtained by substituting the known samples into the discriminant function (a toy sketch follows these notes).

  3. The space here refers to the feature space, and the prior spatial distribution information likewise denotes the information obtained from the spatial distribution of the known data in that space.

  4. http://archive.ics.uci.edu/ml/datasets.html

  5. https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/binary/skin_nonskin

  6. http://vision.stanford.edu/aditya86/ImageNetDogs/
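As a toy illustration of note 2 above (assumed values, not the paper's construction), the prior discriminant information of known samples can be read off from their signed outputs under a linear discriminant f(x) = w·x + b:

```python
import numpy as np

# Toy illustration of note 2 (assumed values, not the paper's construction):
# "substituting the known samples into the discriminant function" means
# evaluating f(x) = w.x + b on labelled data and using the signed outputs.
w, b = np.array([1.0, -2.0]), 0.5           # an assumed linear discriminant
X = np.array([[2.0, 0.0], [0.0, 1.0]])      # known samples
y = np.array([1, -1])                       # their labels
f = X @ w + b                               # signed outputs: [ 2.5, -1.5]
print(y * f)                                # positive margins => both samples lie on the correct side
```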


Acknowledgements

We gratefully thank the anonymous reviewers for their helpful comments and suggestions. This work was supported in part by the National Natural Science Foundation of China (No. 12071475) and the Fundamental Research Funds for the Central Universities (No. BLX201928).

Author information


Corresponding author

Correspondence to Zhijian Zhou.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Wang, C., Wang, H. & Zhou, Z. Reductive and effective discriminative information-based nonparallel support vector machine. Appl Intell 52, 8259–8278 (2022). https://doi.org/10.1007/s10489-021-02874-6


