Twin least squares support vector regression
Section snippets
Motivation
Support vector machine (SVM) [1], [2], [3], rooted in statistical learning theory and the Vapnik-Chervonenkis dimension theory, has shown good generalization performance and has been successfully applied across a wide spectrum of fields, ranging from feature selection [4], [5], [6] and density estimation [7], [8] to function approximation [9], [10]. Compared to other machine learning methods such as artificial neural networks [11], SVM has several merits. For example, on one hand,
TSVR and LSSVR
In this section, we give a concise description of TSVR and LSSVR. Consider a training data set {(x_i, y_i)}_{i=1}^N of size N randomly generated from the unknown regression function f(x), where x_i ∈ R^n represents the input vector variable and y_i ∈ R is the corresponding target value. For the sake of simplicity, let matrix A = [x_1, …, x_N]^T and Y = [y_1, …, y_N]^T, where the superscript T represents the transpose symbol.
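As a concrete illustration of this setup, the sketch below builds a toy training set in the notation above. The sampled function and noise level are illustrative choices of ours, not taken from the paper:

```python
import numpy as np

# Toy instance of the setup described above: N samples (x_i, y_i) drawn
# around an unknown regression function f(x).  A stacks the input vectors
# row-wise and Y collects the targets, matching the
# A = [x_1, ..., x_N]^T, Y = [y_1, ..., y_N]^T convention in the text.
rng = np.random.default_rng(0)
N = 100
A = rng.uniform(-3.0, 3.0, size=(N, 1))              # inputs x_i in R^1
Y = np.sinc(A[:, 0]) + 0.1 * rng.standard_normal(N)  # noisy targets y_i
```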
Twin least squares support vector regression
Following the idea of forming two hyperplanes in TSVR, the down-bound function f1(x) of TLSSVR is constructed by solving the following optimization problem. Compared with Eq. (12), apart from the insensitive parameter, the only difference between them is the additional parameter v_i, which is employed to weight the slack error. The larger the weighting factor is, the smaller the slack error becomes. As v_i increases, the ε
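Because the least squares formulation replaces inequality constraints with equalities, each bound function reduces to solving a linear system rather than a quadratic program. The following is our own minimal reconstruction of that idea, not the paper's exact problem: it assumes uniform weights (all v_i = 1) and a single regularization parameter C, giving an LSSVR-style system for a down-bound f1(x):

```python
import numpy as np

def down_bound_lssvr(K, y, C, eps):
    """Sketch of a down-bound fit f1(x) = sum_i a_i k(x_i, x) + b.

    Assumed linear system (uniform weights, single regularizer C):
        [ 0    e^T     ] [b]   [ 0         ]
        [ e    K + I/C ] [a] = [ y - eps*e ]
    This illustrates the twin least squares idea; it is not the paper's
    exact optimization problem.
    """
    N = K.shape[0]
    M = np.zeros((N + 1, N + 1))
    M[0, 1:] = 1.0                      # constraint row: e^T a = 0
    M[1:, 0] = 1.0                      # bias column
    M[1:, 1:] = K + np.eye(N) / C       # regularized kernel block
    rhs = np.concatenate(([0.0], y - eps))
    sol = np.linalg.solve(M, rhs)
    return sol[1:], sol[0]              # dual coefficients a, bias b
```

For large C the fitted bound K @ a + b tracks the shifted targets y - eps closely, which is the behavior the down-bound function is meant to have.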
Experiments
In this section, we conduct experiments to validate the effectiveness and feasibility of the algorithms proposed in this paper. All experiments are carried out on a personal notebook with an Intel® Core™ i3-2310M CPU @ 2.0 GHz, 2.00 GB memory, and the Windows 7 operating system, in the MATLAB 7.0 environment, so that the same platform is provided for all simulations. As for the kernel function, the commonly used Gaussian kernel is chosen, i.e. k(x_i, x_j) = exp(−‖x_i − x_j‖²/σ²). Here, σ is a tuned parameter and
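For reference, a Gaussian kernel matrix of this form can be computed as follows. This is a generic implementation, not the authors' code; note that some papers put 2σ² in the denominator, while σ² is assumed here:

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    """Gaussian kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / sigma^2)."""
    # Pairwise squared distances via ||x||^2 - 2 x.z + ||z||^2.
    sq = (np.sum(X * X, axis=1)[:, None]
          - 2.0 * (X @ Z.T)
          + np.sum(Z * Z, axis=1)[None, :])
    np.maximum(sq, 0.0, out=sq)   # guard against tiny negative round-off
    return np.exp(-sq / sigma ** 2)
```

The width σ is the tuned parameter mentioned in the text; in practice it is selected together with the regularization parameters, e.g. by cross-validation over a grid.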
Conclusions
Recently, research on regressors and classifiers based on the idea of twin hyperplanes has attracted a great deal of attention due to their good generalization performance and low computational costs. However, the computational complexity of some of these learning machines, such as TWSVM and TSVR, is still high, because they need to solve quadratic programming problems. Therefore, when they are utilized to deal with large-scale problems, this bottleneck becomes obvious, even prohibitive. As we
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China under Grant No. 51006052 and the NUST Outstanding Scholar Supporting Program. Moreover, the authors wish to thank the anonymous reviewers for their constructive comments and great help in the writing process, which improved the manuscript significantly.
References (49)
- et al., Feature selection in the Laplacian support vector machine, Comput. Stat. Data Anal. (2011)
- et al., Nonparallel plane proximal classifier, Signal Process. (2009)
- et al., Application of smoothing technique on twin support vector machines, Pattern Recognition Lett. (2008)
- et al., Least squares twin support vector machines for pattern classification, Expert Syst. Appl. (2009)
- et al., Weighted twin support vector machines with local information and its application, Neural Netw. (2012)
- et al., Laplacian twin support vector machine for semi-supervised classification, Neural Netw. (2012)
- TSVR: an efficient twin support vector machine for regression, Neural Netw. (2010)
- New support vector algorithms with parametric insensitive/margin model, Neural Netw. (2010)
- Efficient twin parametric insensitive support vector regression model, Neurocomputing (2012)
- Building sparse twin support vector machine classifiers in primal space, Inf. Sci. (2011)
- A weighted twin support vector regression, Knowl.-Based Syst.
- Reduced twin support vector regression, Neurocomputing
- Weighted least squares support vector machines: robustness and sparse approximation, Neurocomputing
- Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression, Pattern Recognition
- The Nature of Statistical Learning Theory
- Support-vector networks, Mach. Learn.
- Learning with Kernels
- Combined feature selection and cancer prognosis using support vector machine regression, ACM Trans. Comput. Biol. Bioinformatics
- New feature selection method based on support vector machines for text categorisation, Int. J. Data Anal. Tech. Strategies
- Robust support vector regression networks for function approximation with outliers, IEEE Trans. Neural Networks
- Experiments in value function approximation with sparse support vector regression, Lect. Notes Artif. Intell.
- Pattern Recognition and Neural Networks
Cited by (46)
On robust asymmetric Lagrangian ν-twin support vector regression using pinball loss function
2021, Applied Soft Computing

Robust twin support vector regression based on rescaled Hinge loss
2020, Pattern Recognition
Citation Excerpt: To solve this problem, a weighted version of TSVR was proposed [26]. Similarly, a least-squares variant of TSVR was also proposed [27]. It was observed that although the TSVR is four times faster than the SVR method [27], it suffers from the following limitations (see [27]):

EigenSample: A non-iterative technique for adding samples to small datasets
2018, Applied Soft Computing Journal
Citation Excerpt: We now suggest a formulation that permits a trade-off between the norm of the solution ∥zi∥ and the approximation error, while also preserving the lower and upper bound constraints. The formulation has a flavor similar to that of Support Vector Regression (SVR) [34], and is thus interesting in that it suggests other formulations, just as the literature is replete with a diversity of SVR formulations such as least squares SVRs [35–38], Twin SVRs [39–44], evolutionary SVR [45], fuzzy SVR [46,47], Bayesian SVR [48], and smooth SVR [49,50] among others. In this section, we illustrate the working of EigenSample for augmenting two datasets, a synthetic dataset in ten dimensions, and a dataset of handwritten digit images.

Twin support vector machines: A survey
2018, Neurocomputing
Citation Excerpt: TLSSVR owns faster computational speed. Zhao et al. [142] proposed a new regressor, termed as ɛ-twin support vector regression (ɛ-TSVR). ɛ-TSVR determined a pair of ɛ-insensitive proximal functions by solving two related SVM-type problems.

PTSVRs: Regression models via projection twin support vector machine
2018, Information Sciences
Citation Excerpt: It constructs the regressor function by simultaneously minimizing the fitting loss and one-side ϵ-insensitive loss, but also introduces a pair of regularization terms to improve the smoothness of the regressor. Zhao et al. [39] extended TSVR to a least squares version by combining the idea in LS-SVM. In [24], we further presented a twin parametric-insensitive SVR (TPISVR) algorithm which determines the regressor by the parametric-insensitive bound functions in the spirit of our twin parametric-margin SVM (TPMSVM) [23] and par-ν-SVR [12].

Pairing support vector algorithm for data regression
2017, Neurocomputing
Yong-Ping Zhao received his B.S. degree in thermal energy and power engineering from Nanjing University of Aeronautics and Astronautics, Nanjing, China, in July 2004. He then pursued the M.S. and Ph.D. degrees in kernel methods at Nanjing University of Aeronautics and Astronautics, receiving the Ph.D. degree in December 2009. Currently, he is an associate professor with the ZNDY Ministerial Key Laboratory, Nanjing University of Science & Technology. His research interests include machine learning and kernel methods.
Jing Zhao received the B.Eng. degree from Weifang University, China, in 2012. She is currently pursuing the M.Eng. degree at Nanjing University of Science and Technology, China. Her research interests include machine learning, pattern recognition, etc.
Min Zhao received her B.S. degree from Nanjing Normal University, China, in 2007. She received the M.S. degree in systems engineering from Nanjing University of Science and Technology, Nanjing, China, in 2009. Her research interests include machine learning, automation, etc.