Abstract
The support vector classification-regression machine for K-class classification (K-SVCR) is a novel multi-class classification method based on the "1-versus-1-versus-rest" structure. In this paper, we propose a least squares version of K-SVCR, named LSK-SVCR. Like K-SVCR, this method evaluates all the training data within a "1-versus-1-versus-rest" structure, so the algorithm generates ternary outputs {−1, 0, +1}. In LSK-SVCR, the solution of the primal problem is obtained by solving a single system of linear equations, rather than the dual problem, which in K-SVCR is a convex quadratic programming problem. Experimental results on several benchmark, MC-NDC, and handwritten digit recognition data sets show that LSK-SVCR not only achieves better classification accuracy than the K-SVCR and Twin-KSVC algorithms but also has a remarkably higher learning speed.
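The abstract's key computational point, replacing a dual quadratic program with a single linear system, is the standard least-squares SVM mechanism: the equality-constrained least-squares primal has KKT conditions that form one solvable linear system. The sketch below is a generic binary linear-kernel LS-SVM, not the authors' exact LSK-SVCR formulation (which additionally handles the ternary "rest" class); the function names are illustrative only.

```python
import numpy as np

def lssvm_train(X, y, gamma=1.0):
    """Train a linear-kernel least-squares SVM by solving one linear system.

    The KKT conditions of the least-squares primal reduce to:
        [ 0   y^T              ] [  b    ]   [ 0 ]
        [ y   Omega + I/gamma  ] [ alpha ] = [ 1 ]
    where Omega_ij = y_i * y_j * <x_i, x_j>.  No quadratic program is solved.
    """
    n = len(y)
    Omega = np.outer(y, y) * (X @ X.T)          # label-weighted Gram matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y                                # bias constraint row
    A[1:, 0] = y                                # bias column
    A[1:, 1:] = Omega + np.eye(n) / gamma       # regularized block
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)               # the single linear solve
    return sol[1:], sol[0]                      # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_new):
    """Decision rule: sign( sum_i alpha_i y_i <x_i, x> + b )."""
    return np.sign((alpha * y_train) @ (X_train @ X_new.T) + b)
```

Training cost is dominated by one dense solve, which is what gives least-squares variants their speed advantage over QP-based training.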
Acknowledgements
The authors were supported by the Czech Science Foundation Grant P403-18-04735S.
Cite this article
Moosaei, H., Hladík, M. Least squares approach to K-SVCR multi-class classification with its applications. Ann Math Artif Intell 90, 873–892 (2022). https://doi.org/10.1007/s10472-021-09747-1