
Universum parametric-margin ν-support vector machine for classification using the difference of convex functions algorithm


Abstract

Universum data, which do not belong to any class of a classification problem, can be exploited as prior knowledge to improve generalization performance. In this paper, we design a novel parametric-margin ν-support vector machine with universum data (\(\mathfrak{U}\)Par-ν-SVM), through which unlabeled samples can be integrated into supervised learning. We propose a fast method to solve the \(\mathfrak{U}\)Par-ν-SVM problem: its primal problem, which is a nonconvex optimization problem, is transformed into an unconstrained optimization problem whose objective can be written as a difference of two convex functions (DC). To solve this unconstrained problem, we suggest a boosted difference of convex functions algorithm (BDCA) based on a generalized Newton method (named DC-\(\mathfrak{U}\)Par-ν-SVM). We examined our approach on UCI benchmark data sets, NDC data sets, a handwritten digit recognition data set, and a landmine detection data set. The experimental results confirmed the effectiveness and superiority of the proposed method for solving classification problems in comparison with other methods.
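The abstract describes solving the nonconvex \(\mathfrak{U}\)Par-ν-SVM primal via a DC decomposition and a boosted DCA. The sketch below is a minimal, generic illustration of the DCA/BDCA iteration on a toy one-dimensional DC function; it is not the authors' DC-\(\mathfrak{U}\)Par-ν-SVM solver (which applies a generalized Newton method to the SVM objective), and the toy function, step-size constants, and tolerance are illustrative assumptions.

```python
import numpy as np

# Generic sketch of a boosted DCA (BDCA) iteration on a toy DC function
# f(x) = g(x) - h(x) with g(x) = x**4 and h(x) = 2*x**2 (both convex).
# Illustration only: the toy objective and constants are assumptions,
# not the paper's U-Par-nu-SVM problem or its generalized Newton solver.

def f(x):
    return x**4 - 2.0 * x**2            # f = g - h

def grad_h(x):
    return 4.0 * x                      # gradient of h(x) = 2x^2

def dca_subproblem(c):
    # Classical DCA step: argmin_x g(x) - c*x with g(x) = x^4
    # => 4x^3 = c  =>  x = sign(c) * (|c|/4)^(1/3)
    return np.sign(c) * (abs(c) / 4.0) ** (1.0 / 3.0)

def bdca(x0, alpha=0.1, beta=0.5, lam0=1.0, max_iter=50, tol=1e-10):
    x = x0
    for _ in range(max_iter):
        y = dca_subproblem(grad_h(x))   # DCA point
        d = y - x                       # BDCA search direction
        if abs(d) < tol:
            break
        # Backtracking (Armijo-type) line search along d from the DCA point
        lam = lam0
        while f(y + lam * d) > f(y) - alpha * (lam * d) ** 2 and lam > 1e-12:
            lam *= beta
        x = y + lam * d                 # boosted update
    return x

print(bdca(3.0))   # approaches a stationary point of f (here x = 1)
```

The "boost" is the extra line search along d = y − x after the classical DCA step, which can only decrease the objective further; in the paper's setting the convex subproblem is not available in closed form and is instead handled with a generalized Newton method.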


Notes

  1. http://people.ee.duke.edu/~lcarin/LandmineData.zip


Acknowledgements

The work of H. Moosaei and M. Hladík was supported by the Czech Science Foundation Grant P403-18-04735S.

Author information


Corresponding author

Correspondence to Hossein Moosaei.

Ethics declarations

Competing interests

The authors declare that they have no conflicts of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Moosaei, H., Bazikar, F., Ketabchi, S. et al. Universum parametric-margin ν-support vector machine for classification using the difference of convex functions algorithm. Appl Intell 52, 2634–2654 (2022). https://doi.org/10.1007/s10489-021-02402-6

