
Large-Scale Least Squares Twin SVMs

Published: 02 June 2021

Abstract

Over the last decade, twin support vector machine (TWSVM) classifiers have received considerable attention in pattern classification tasks. However, the TWSVM formulation still suffers from two shortcomings: (1) TWSVM requires inverse matrix calculations in its Wolfe-dual problems, which is intractable for large-scale datasets with numerous features and samples, and (2) TWSVM minimizes the empirical risk instead of the structural risk in its formulation. With the huge amounts of data available today, these disadvantages render TWSVM an ineffective choice for pattern classification tasks. In this article, we propose an efficient large-scale least squares twin support vector machine (LS-LSTSVM) for pattern classification that rectifies these shortcomings. The proposed LS-LSTSVM introduces different Lagrangian functions to eliminate the need for calculating inverse matrices. The proposed LS-LSTSVM also does not employ kernel-generated surfaces for the non-linear case and instead uses the kernel trick directly, which makes it superior to the original TWSVM and LSTSVM. Lastly, LS-LSTSVM minimizes the structural risk, embodying the essence of statistical learning theory and consequently improving classification accuracy. The proposed LS-LSTSVM is solved using the sequential minimal optimization (SMO) technique, making it more suitable for large-scale problems. We further prove the convergence of the proposed LS-LSTSVM. Exhaustive experiments on several real-world benchmarks and NDC-based large-scale datasets demonstrate that the proposed LS-LSTSVM is feasible for large datasets and, in most cases, performs better than existing algorithms.
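The abstract's central computational point is that the least-squares dual can be solved by cheap coordinate-wise (SMO-style) updates rather than by forming an explicit matrix inverse. The sketch below illustrates that idea on a toy least squares SVM, solving the dual system (K + I/C)α = y with Gauss-Seidel coordinate updates. This is a simplified stand-in, not the paper's actual LS-LSTSVM algorithm; all function names and parameter values are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Pairwise RBF (Gaussian) kernel matrix."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def lssvm_fit(K, y, C=10.0, sweeps=300):
    """Solve the least-squares dual system (K + I/C) alpha = y with
    Gauss-Seidel coordinate updates -- no explicit matrix inverse.
    Convergence holds because K + I/C is symmetric positive definite."""
    A = K + np.eye(len(y)) / C
    alpha = np.zeros(len(y))
    for _ in range(sweeps):
        for i in range(len(y)):
            # Closed-form optimum of coordinate i with all others fixed.
            alpha[i] += (y[i] - A[i] @ alpha) / A[i, i]
    return alpha

# Toy two-class problem with labels in {-1, +1}.
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
K = rbf_kernel(X, X)
alpha = lssvm_fit(K, y)
pred = np.sign(K @ alpha)  # training predictions
```

Each inner update costs O(n), so one sweep costs O(n²) and needs only one row of the matrix at a time, which is what makes this style of solver attractive when the full inverse is intractable.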




Published In

ACM Transactions on Internet Technology, Volume 21, Issue 2
June 2021
599 pages
ISSN: 1533-5399
EISSN: 1557-6051
DOI: 10.1145/3453144
Editor: Ling Liu
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 02 June 2021
Online AM: 07 May 2020
Accepted: 01 May 2020
Revised: 01 April 2020
Received: 01 March 2020
Published in TOIT Volume 21, Issue 2


Author Tags

  1. Machine learning
  2. support vector machines (SVMs)
  3. large scale SVMs
  4. least squares twin SVM

Qualifiers

  • Research-article
  • Refereed

Funding Sources

  • Science and Engineering Research Board (SERB), India
  • Ramanujan Fellowship and Early Career Research Award Schemes
  • Council of Scientific & Industrial Research (CSIR), New Delhi, India
  • Extra Mural Research (EMR) Scheme


Article Metrics

  • Downloads (last 12 months): 41
  • Downloads (last 6 weeks): 7
Reflects downloads up to 16 Feb 2025

Cited By

  • (2024) Twin support vector machines based on chaotic mapping dung beetle optimization algorithm. Journal of Computational Design and Engineering 11, 3, 101–110. DOI: 10.1093/jcde/qwae040
  • (2024) Credit risk assessment method driven by asymmetric loss function. Applied Soft Computing 167, 112355. DOI: 10.1016/j.asoc.2024.112355
  • (2024) Weighted least squares twin support vector machine based on density peaks. Pattern Analysis & Applications 27, 3. DOI: 10.1007/s10044-024-01311-x
  • (2023) Robust twin depth support vector machine based on average depth. Knowledge-Based Systems 274, 110627. DOI: 10.1016/j.knosys.2023.110627
  • (2023) Deep multi-view multiclass twin support vector machines. Information Fusion 91, 80–92. DOI: 10.1016/j.inffus.2022.10.005
  • (2022) Asymmetric and robust loss function driven least squares support vector machine. Knowledge-Based Systems 258, 109990. DOI: 10.1016/j.knosys.2022.109990
  • (2022) Regularized Least Squares Twin SVM for Multiclass Classification. Big Data Research 27, 100295. DOI: 10.1016/j.bdr.2021.100295
  • (2022) A least squares twin support vector machine method with uncertain data. Applied Intelligence 53, 9, 10668–10684. DOI: 10.1007/s10489-022-03897-3
  • (2022) Comprehensive review on twin support vector machines. Annals of Operations Research 339, 3, 1223–1268. DOI: 10.1007/s10479-022-04575-w
  • (2022) A lagrangian-based approach for universum twin bounded support vector machine with its applications. Annals of Mathematics and Artificial Intelligence 91, 2–3, 109–131. DOI: 10.1007/s10472-022-09783-5
  • Show More Cited By
