Abstract
To address the bias term and the computational complexity of the standard support vector machine, this paper proposes an unbiased least squares support vector machine (LSSVM) model. The model eliminates the bias term of the LSSVM by modifying the form of the structural risk, from which an unbiased least squares support vector classifier and an unbiased least squares support vector regressor are derived. Building on this model, we design a new learning algorithm that exploits the structure of the kernel matrix through Cholesky factorization, greatly simplifying the computation of the Lagrange multipliers. Experiments on several different datasets are carried out, including classification on common benchmark datasets, synthetic aperture radar (SAR) image automatic target recognition, and chaotic time series prediction. The recognition rates and fitting precision show that the unbiased LSSVM model has good approximation ability and fitting accuracy, better generalization capability and stability, and a substantial improvement in learning speed.
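Removing the bias term means the LSSVM's KKT conditions reduce to a single symmetric positive-definite linear system in the Lagrange multipliers, which a Cholesky factorization solves directly. The sketch below is a minimal pure-Python illustration of that idea under stated assumptions (an RBF kernel, hypothetical parameter names `gamma` and `sigma`, and toy data); it is not the authors' exact algorithm.

```python
import math

def rbf(x, z, sigma=1.0):
    # Gaussian (RBF) kernel between two feature vectors
    return math.exp(-sum((a - b) ** 2 for a, b in zip(x, z)) / (2 * sigma ** 2))

def cholesky(A):
    # Lower-triangular factor L with A = L L^T (A must be symmetric positive definite)
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

def solve_chol(L, b):
    # Solve (L L^T) x = b by forward then backward substitution
    n = len(L)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

def train_unbiased_lssvm(X, y, gamma=10.0, sigma=1.0):
    # Without a bias term the optimality conditions collapse to
    # (K + I/gamma) alpha = y -- one SPD solve, no bordered system.
    n = len(X)
    A = [[rbf(X[i], X[j], sigma) + (1.0 / gamma if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    return solve_chol(cholesky(A), y)

def predict(X, alpha, x, sigma=1.0):
    # Unbiased decision function: f(x) = sum_i alpha_i k(x_i, x), no "+ b" term
    return sum(a * rbf(xi, x, sigma) for a, xi in zip(alpha, X))
```

For classification, the label of a new point would be taken as the sign of `predict`; for regression its value is used directly, mirroring the paper's unified treatment of both tasks.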
Acknowledgments
This work was jointly supported by the National Key Project for Basic Research of China (Grant No. G2007cb311003) and the National Natural Science Foundation of China (Grant Nos. 60625304 and 60621062).
Cite this article
Wang, HQ., Sun, FC., Cai, YN. et al. An unbiased LSSVM model for classification and regression. Soft Comput 14, 171–180 (2010). https://doi.org/10.1007/s00500-009-0435-z