
SVM vs regularized least squares classification



Abstract:

Support vector machines (SVMs) and regularized least squares (RLS) are two recent, promising techniques for classification. On the one hand, SVMs implement the structural risk minimization principle, use the kernel trick to extend it to the nonlinear case, and in general yield a sparse representation of the solution. On the other hand, RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel; while both methods have a sound mathematical foundation, RLS is strikingly simple. In addition, the performance of SVMs has been well documented, whereas relatively little has been reported for RLS. This paper applies the two techniques to a collection of data sets and presents results demonstrating virtually identical performance by the two methods.
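The simplicity the abstract attributes to RLS can be made concrete: in a reproducing kernel Hilbert space, fitting reduces to solving one regularized linear system, with no quadratic program as in SVM training. The following is a minimal sketch (an illustration, not code from the paper; the kernel choice, regularization parameter, and toy data are all assumptions):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def rls_fit(X, y, lam=0.1, gamma=1.0):
    # RLS in the RKHS: solve (K + lam * n * I) alpha = y.
    # A single dense linear solve replaces the SVM's QP.
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def rls_predict(X_train, alpha, X_test, gamma=1.0):
    # Classify by the sign of f(x) = sum_i alpha_i k(x_i, x).
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha)

# Toy two-class problem: two well-separated Gaussian blobs, labels -1/+1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
alpha = rls_fit(X, y)
acc = (rls_predict(X, alpha, X) == y).mean()
```

Note that, unlike the SVM solution, every training point receives a nonzero coefficient here, which illustrates the sparsity trade-off the abstract mentions.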
Date of Conference: 26 August 2004
Date Added to IEEE Xplore: 20 September 2004
Print ISBN: 0-7695-2128-2
Print ISSN: 1051-4651
Conference Location: Cambridge, UK
