Consider the classification task of assigning a test instance to one of two or more possible classes. An intuitive way to proceed is to assign the instance to the class to which its distance is minimal. If one takes the distance to the convex hull of a class as the distance measure, the resulting classification method is the Nearest Convex Hull (NCH) classifier. Two key issues severely restrict the applicability of this method as such, and we solve both in this paper: first, how to handle class overlap, and second, how to provide (nonlinear) solutions with better generalization ability. The first problem is handled using so-called kernel functions and slack variables. The second problem is dealt with using a penalization term that suppresses overly complex solutions. We call the resulting method the soft-NCH classifier. In spirit and computationally the method is close to the popular Support Vector Machine (SVM) classifier and can be viewed as an instance-based large-margin classification technique. Advantages of the soft-NCH classifier include its robustness to outliers, good generalization ability, and naturally easy handling of multi-class problems. We compare the performance of soft-NCH against state-of-the-art techniques and report promising results.
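The basic (hard) NCH rule described above can be sketched as follows. This is an illustrative sketch only, not the paper's soft-NCH method: it ignores class overlap, kernels, and the penalization term, and the helper names `dist_to_hull` and `nch_predict` are my own. The distance from a point to a class's convex hull is the solution of a small quadratic program, solved here with SciPy's SLSQP optimizer.

```python
import numpy as np
from scipy.optimize import minimize

def dist_to_hull(x, pts):
    """Distance from point x to the convex hull of the rows of pts:
    minimize ||x - pts.T @ a||^2 subject to a >= 0, sum(a) = 1."""
    n = pts.shape[0]
    a0 = np.full(n, 1.0 / n)  # start from the centroid
    obj = lambda a: np.sum((x - pts.T @ a) ** 2)
    cons = {"type": "eq", "fun": lambda a: a.sum() - 1.0}
    res = minimize(obj, a0, bounds=[(0.0, None)] * n, constraints=cons)
    return np.sqrt(res.fun)

def nch_predict(x, classes):
    """Hard NCH rule: assign x to the class whose convex hull is nearest."""
    dists = {label: dist_to_hull(x, pts) for label, pts in classes.items()}
    return min(dists, key=dists.get)

# Toy example: two well-separated 2-D classes.
classes = {
    0: np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]),
    1: np.array([[4.0, 4.0], [5.0, 4.0], [4.0, 5.0]]),
}
print(nch_predict(np.array([0.5, 0.5]), classes))
```

The soft-NCH classifier of the paper differs from this sketch by relaxing the hull membership constraints with slack variables, replacing inner products with kernel evaluations, and penalizing solution complexity.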