Neurocomputing
Volume 118, 22 October 2013, Pages 225-236

Twin least squares support vector regression

https://doi.org/10.1016/j.neucom.2013.03.005

Abstract

In this paper, combining the idea of twin hyperplanes with the fast training speed of least squares support vector regression (LSSVR) yields a new regressor, termed twin least squares support vector regression (TLSSVR). TLSSVR outperforms standard LSSVR in generalization performance and, in contrast to other twin-hyperplane algorithms, runs at a faster computational speed; this advantage is especially obvious on large scale problems. To accelerate the testing speed of TLSSVR, it is sparsified using a simple mechanism, yielding STLSSVR. In addition to introducing these algorithms, extensive experiments are carried out, including a toy problem, several small and large scale data sets, and a gas furnace example. These applications demonstrate the effectiveness and efficiency of the proposed algorithms.

Section snippets

Motivation

Support vector machine (SVM) [1], [2], [3], rooted in statistical learning theory and the Vapnik–Chervonenkis dimension theory, has shown good generalization performance and has been applied successfully across a wide spectrum of fields, ranging from feature selection [4], [5], [6] and density estimation [7], [8] to function approximation [9], [10]. Compared with other machine learning methods such as artificial neural networks [11], SVM has several merits. For example, on one hand, …

TSVR and LSSVR

In this section, we give a concise description of TSVR and LSSVR. Consider a training data set $\{(x_i,d_i)\}_{i=1}^{N}$ of size $N$ randomly generated from an unknown regression function $f(x)$, where $x_i\in\mathbb{R}^{p\times 1}$ is the input vector and $d_i$ is the corresponding target value. For simplicity, let $X=[x_1,\ldots,x_N]^{T}$ and $d=[d_1,d_2,\ldots,d_N]^{T}$, where the superscript $T$ denotes the transpose.
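
To make the notation concrete, the following minimal NumPy sketch builds the data matrix X and the target vector d from synthetic samples; the dimensions, the underlying function, and the noise level are illustrative assumptions, not data from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    N, p = 100, 2                                   # sample size and input dimension (assumed)
    X = rng.uniform(-1.0, 1.0, size=(N, p))         # X = [x_1, ..., x_N]^T, one input vector per row
    f = lambda Z: np.sin(Z[:, 0]) + 0.5 * Z[:, 1]   # an assumed underlying regression function
    d = f(X) + 0.1 * rng.standard_normal(N)         # targets d = [d_1, ..., d_N]^T with additive noise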

Twin least squares support vector regression

Following the idea of forming two hyperplanes in TSVR, the down-bound function $f_1(x)$ of TLSSVR is constructed by solving the following optimization problem:

$$\min_{w,\,\xi,\,b}\ \frac{1}{2}w^{T}w+\frac{C}{2}\sum_{i=1}^{N}v_{i}\xi_{i}^{2}\quad \text{s.t.}\quad d_{i}=w^{T}\varphi(x_{i})+b+\varepsilon_{1}+\xi_{i},\quad i=1,\ldots,N,$$

where $v_i\in\mathbb{R}^{+}$. Compared with Eq. (12), apart from the insensitive parameter $\varepsilon_1$, the only difference is the additional parameter $v_i$, which weights the slack error $\xi_i$: the larger the weight $v_i$, the smaller the slack error $\xi_i$. As $v_i$ increases, the $\varepsilon$…
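
Because the constraints are equalities and the losses are squared, a problem of this form can be reduced, via its KKT conditions, to a single linear system in the Lagrange multipliers and the bias, in the same way as the standard weighted LSSVR. The sketch below illustrates that route under these assumptions; it is not the authors' reference implementation, and the name fit_down_bound and its argument list are hypothetical.

    import numpy as np

    def fit_down_bound(K, d, C, eps1, v):
        # K    : N x N kernel matrix with K[i, j] = k(x_i, x_j)
        # d    : N-vector of targets; eps1: insensitive parameter; v: N-vector of weights v_i
        # The KKT conditions give w = sum_i alpha_i*phi(x_i), sum_i alpha_i = 0 and
        # xi_i = alpha_i/(C*v_i); combined with the equality constraints they yield
        #   [ 0   1^T   ] [ b     ]   [ 0            ]
        #   [ 1   K + D ] [ alpha ] = [ d - eps1 * 1 ],   D = diag(1/(C*v_i)).
        N = K.shape[0]
        A = np.zeros((N + 1, N + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.diag(1.0 / (C * np.asarray(v, dtype=float)))
        rhs = np.concatenate(([0.0], d - eps1))
        sol = np.linalg.solve(A, rhs)
        b, alpha = sol[0], sol[1:]
        return alpha, b   # down-bound prediction: f1(x) = sum_j alpha[j]*k(x, x_j) + b

Solving one dense linear system of size N+1, instead of a quadratic program, is what gives the least squares variants their speed advantage over TSVR on larger data sets.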

Experiments

In this section, we carry out experiments to validate the effectiveness and feasibility of the proposed algorithms. All experiments are run on a personal notebook with an Intel® Core i3-2310M CPU @ 2.0 GHz, 2.00 GB of memory, and the Windows 7 operating system, in the MATLAB 7.0 environment, so that all simulations share the same platform. As the kernel function, the commonly used Gaussian kernel is chosen, i.e. $k(x_i,x_j)=\exp\!\left(-\|x_i-x_j\|^{2}/(2\gamma^{2})\right)$. Here, $\gamma$ is a tuned parameter and …
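
For reference, a minimal NumPy sketch of this kernel evaluation is given below; the function name gaussian_kernel is illustrative, and the formula is the one stated above.

    import numpy as np

    def gaussian_kernel(A, B, gamma):
        # k(x_i, x_j) = exp(-||x_i - x_j||^2 / (2*gamma^2)) for every row pair of A and B
        sq_dist = (np.sum(A**2, axis=1)[:, None]
                   + np.sum(B**2, axis=1)[None, :]
                   - 2.0 * A @ B.T)
        return np.exp(-sq_dist / (2.0 * gamma**2))

    # Example: K = gaussian_kernel(X, X, gamma=0.5) gives the N x N training kernel matrix.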

Conclusions

Recently, research on regressors and classifiers built on the idea of twin hyperplanes has attracted a great deal of attention due to their good generalization performance and low computational costs. However, the computational complexity of some of these learning machines, such as TWSVM and TSVR, is still high, because they still need to solve quadratic programming problems. Therefore, when they are used to deal with large scale problems, this bottleneck becomes obvious, even prohibitive. As we …

Acknowledgments

This research was partially supported by the National Natural Science Foundation of China under Grant No. 51006052 and by the NUST Outstanding Scholar Supporting Program. The authors also wish to thank the anonymous reviewers for their constructive comments and great help during the writing process, which improved the manuscript significantly.


References (49)

  • Y. Xu et al.

    A weighted twin support vector regression

    Knowl.-based Syst.

    (2012)
  • M. Singh et al.

    Reduced twin support vector regression

    Neurocomputing

    (2011)
  • J.A.K. Suykens et al.

    Weighted least squares support vector machines: robustness and sparse approximation

    Neurocomputing

    (2002)
  • S. An et al.

    Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression

    Pattern Recognition

    (2007)
  • V.N. Vapnik

    The Nature of Statistical Learning Theory

    (1995)
  • C. Cortes et al.

    Support-vector networks

    Mach. Learn.

    (1995)
  • B. Schölkopf et al.

    Learning with Kernels

    (2002)
  • B.-Y. Sun et al.

    Combined feature selection and cancer prognosis using support vector machine regression

    ACM Trans. Comput. Biol. Bioinformatics

    (2011)
  • Y. Xu et al.

    New feature selection method based on support vector machines for text categorisation

    Int. J. Data Anal. Tech. Strategies

    (2011)
  • X. Teng, J. Yuan, H. Yu, Probability density estimation based on SVM, in: Proceedings of 2009 Global Mobile Congress,...
  • X. Shan, J. Zhou, F. Xiao, Support vector machine method for multivariate density estimation based on copulas, in:...
  • C.-C. Chuang et al.

    Robust support vector regression networks for function approximation with outliers

    IEEE Trans. Neural Networks

    (2002)
  • T. Jung et al.

    Experiments in value function approximation with sparse support vector regression

    Lect. Notes Artif. Intell.

    (2004)
  • B.D. Ripley

    Pattern Recognition and Neural Networks

    (2005)

Yong-Ping Zhao received his B.S. degree in thermal energy and power engineering from Nanjing University of Aeronautics and Astronautics, Nanjing, China, in July 2004. He then worked toward the M.S. and Ph.D. degrees in kernel methods at Nanjing University of Aeronautics and Astronautics, receiving the Ph.D. degree in December 2009. Currently, he is an associate professor with the ZNDY Ministerial Key Laboratory, Nanjing University of Science & Technology. His research interests include machine learning and kernel methods.

Jing Zhao received the B.Eng. degree from Weifang University, China, in 2012. She is currently pursuing the M.Eng. degree at Nanjing University of Science and Technology, China. Her research interests include machine learning, pattern recognition, etc.

Min Zhao received her B.S. degree from Nanjing Normal University, China, in 2007. She received the M.S. degree in systems engineering from Nanjing University of Science and Technology, Nanjing, China, in 2009. Her research interests include machine learning, automation, etc.
