Paper | 13 January 2003
Boosting and Support Vector Machines as Optimal Separators
Saharon Rosset, Ji Zhu, Trevor J. Hastie
Proceedings Volume 5010, Document Recognition and Retrieval X; (2003) https://doi.org/10.1117/12.497492
Event: Electronic Imaging 2003, 2003, Santa Clara, CA, United States
Abstract
In this paper we study boosting methods from a new perspective. We build on recent work by Efron et al. to show that boosting approximately (and in some cases exactly) minimizes its loss criterion with an L1 constraint. For the two most commonly used loss criteria (exponential and logistic log-likelihood), we further show that as the constraint diminishes, or equivalently as the boosting iterations proceed, the solution converges in the separable case to an “L1-optimal” separating hyper-plane. This “L1-optimal” separating hyper-plane has the property of maximizing the minimal margin of the training data, as defined in the boosting literature. We illustrate through examples the regularized and asymptotic behavior of the solutions to the classification problem with both loss criteria.
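For concreteness, the setup described in the abstract can be sketched as follows (the notation here is assumed for illustration, not quoted from the paper's own text). The L1-constrained problem is

\[
\hat{\beta}(c) \;=\; \arg\min_{\|\beta\|_1 \le c} \; \sum_{i=1}^{n} C\!\left(y_i,\; \beta^{\top} x_i\right),
\qquad
C(y, f) = e^{-y f} \ \text{(exponential)} \quad \text{or} \quad C(y, f) = \log\!\left(1 + e^{-y f}\right) \ \text{(logistic log-likelihood)},
\]

and the claimed asymptotic behavior is that, on separable data, as the constraint loosens (c grows, i.e., as boosting iterations proceed) the normalized solution approaches the L1-margin-maximizing separating hyper-plane:

\[
\frac{\hat{\beta}(c)}{\|\hat{\beta}(c)\|_1} \;\longrightarrow\; \arg\max_{\|\beta\|_1 = 1} \; \min_{i} \, y_i \, \beta^{\top} x_i .
\]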
© (2003) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Saharon Rosset, Ji Zhu, and Trevor J. Hastie "Boosting and Support Vector Machines as Optimal Separators", Proc. SPIE 5010, Document Recognition and Retrieval X, (13 January 2003); https://doi.org/10.1117/12.497492
KEYWORDS: Algorithms, Associative arrays, Field emission displays, Neodymium, Data modeling, MODIS, Statistical modeling