Abstract
Several weighted (twin) support vector machines have been developed to handle contaminated data. However, the sample weights are generally obtained in advance from prior knowledge of the data. This article develops an adaptively weighted twin support vector machine via Bregman divergences. To better handle contaminated data, we employ an insensitive loss function to control the fitting error of the samples in one class and introduce the weight (fuzzy membership) of each sample into the proposed model. Owing to the structure of the model, it is solved with the alternating optimization technique: an accelerated first-order method solves the resulting quadratic programming problem, and the fuzzy membership of each sample is obtained analytically in the case of Bregman divergences. Experiments on several data sets show that our method achieves better classification performance than previous methods, especially in the open-set experiment.
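The alternating scheme described above can be illustrated with a small sketch. This is not the paper's model: as a stand-in for the twin-SVM subproblem it uses a weighted least-squares fit, and it assumes a Kullback-Leibler divergence (one member of the Bregman family), under which the weight update is known to have the closed form w_i proportional to exp(-loss_i / gamma). The function names and the parameter `gamma` are illustrative choices, not from the source.

```python
import numpy as np

def update_weights(losses, gamma):
    """Analytic weight step under a KL-divergence (Bregman) regularizer:
    w_i ∝ exp(-loss_i / gamma), normalized so the weights average to 1."""
    w = np.exp(-losses / gamma)
    return w / w.sum() * len(losses)

def weighted_fit(X, y, w):
    """Model step (stand-in for the QP subproblem): solve the weighted
    least-squares problem min_beta sum_i w_i (x_i^T beta - y_i)^2."""
    W = np.diag(w)
    # Small ridge term keeps the normal equations well conditioned.
    A = X.T @ W @ X + 1e-8 * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ W @ y)

def alternate(X, y, gamma=1.0, iters=20):
    """Alternating optimization: fit the model with weights fixed,
    then update the weights analytically from the per-sample losses."""
    w = np.ones(len(y))
    for _ in range(iters):
        beta = weighted_fit(X, y, w)      # model step
        losses = (X @ beta - y) ** 2      # per-sample fitting error
        w = update_weights(losses, gamma) # analytic weight step
    return beta, w
```

On data with an outlier, the closed-form update drives the outlier's weight toward zero, so the fit converges to the clean samples; this mirrors how adaptive weighting suppresses contaminated points without requiring the weights in advance.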
Acknowledgements
This work was partially supported by the Fundamental Research Funds for the Central Universities (2015XKMS084).
Ethics declarations
Conflict of interest
The authors declare that they have no conflicts of interest.
Cite this article
Liang, Z., Zhang, L., Liu, J. et al. Adaptively weighted learning for twin support vector machines via Bregman divergences. Neural Comput & Applic 32, 3323–3336 (2020). https://doi.org/10.1007/s00521-018-3843-0