Least squares approach to K-SVCR multi-class classification with its applications

Annals of Mathematics and Artificial Intelligence

Abstract

The support vector classification-regression machine for K-class classification (K-SVCR) is a novel multi-class classification method based on the “1-versus-1-versus-rest” structure. In this paper, we propose a least squares version of K-SVCR, named LSK-SVCR. Like K-SVCR, this method evaluates all training data within a “1-versus-1-versus-rest” structure, so the algorithm generates ternary outputs {−1, 0, +1}. In LSK-SVCR, the solution of the primal problem is computed by solving a single system of linear equations instead of the dual problem, which in K-SVCR is a convex quadratic programming problem. Experimental results on several benchmark, MC-NDC, and handwritten digit recognition data sets show that LSK-SVCR not only achieves better classification accuracy than the K-SVCR and Twin-KSVC algorithms but also has a remarkably higher learning speed.
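The computational idea highlighted in the abstract — fitting ternary targets {−1, 0, +1} in a least-squares sense so that training reduces to one linear system in closed form, rather than a dual quadratic program — can be sketched as follows. This is an illustrative simplification under ridge-regression-style assumptions, not the paper's exact LSK-SVCR formulation (which, among other things, uses separate penalty parameters for the two focused classes and the rest class); the function names and the dead-zone threshold `eps` are our own.

```python
import numpy as np

def ls_ternary_fit(X, y, c=1.0):
    """Fit a linear scorer f(x) = w.x + b by regularized least squares
    against ternary targets y in {-1, 0, +1}.

    Illustrative sketch only: a single ridge-style system stands in for
    the LSK-SVCR primal, whose solution is likewise one linear system.
    """
    n, d = X.shape
    A = np.hstack([X, np.ones((n, 1))])   # append a bias column
    D = np.eye(d + 1)
    D[-1, -1] = 0.0                       # do not regularize the bias term
    # Closed form: one linear solve replaces the dual QP of K-SVCR.
    wb = np.linalg.solve(A.T @ A + D / c, A.T @ y)
    return wb[:-1], wb[-1]

def ls_ternary_predict(X, w, b, eps=0.5):
    """Ternary decision: +1 / -1 when the score is clearly signed,
    0 (the "rest" class) inside the dead zone |f(x)| <= eps."""
    s = X @ w + b
    return np.where(s > eps, 1, np.where(s < -eps, -1, 0))
```

On a toy 1-D problem with the two focused classes near x = ±2 and the rest class near x = 0, the fitted scorer recovers the ternary labels; the speed advantage the abstract reports comes from the fact that `np.linalg.solve` on one (d+1)×(d+1) system is far cheaper than iterating a QP solver.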



Acknowledgements

The authors were supported by the Czech Science Foundation Grant P403-18-04735S.

Author information

Corresponding author

Correspondence to Hossein Moosaei.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Moosaei, H., Hladík, M. Least squares approach to K-SVCR multi-class classification with its applications. Ann Math Artif Intell 90, 873–892 (2022). https://doi.org/10.1007/s10472-021-09747-1
