A General Wrapper Approach to Selection of Class-Dependent Features


Abstract:

In this paper, we argue that for a C-class classification problem, C 2-class classifiers, each of which discriminates one class from the remaining classes and uses its own characteristic input feature subset, should in general outperform, or at least match the performance of, a single C-class classifier with one common input feature subset. For each class, we select a desirable feature subset that yields the lowest classification error rate for that class, using a given classifier and feature subset search algorithm. To compare all models fairly, we propose a weighting method for the class-dependent classifiers, i.e., assigning a weight to each model's output before the comparison is carried out. The method's performance is evaluated on two artificial data sets and several real-world benchmark data sets, with the support vector machine (SVM) as the classifier, and with RELIEF, class separability, and minimal-redundancy-maximal-relevancy (mRMR) as attribute importance measures. Our results indicate that the class-dependent feature subsets found by our approach can effectively remove irrelevant or redundant features, while maintaining or improving (sometimes substantially) the classification accuracy, in comparison with other feature selection methods.
Published in: IEEE Transactions on Neural Networks (Volume: 19, Issue: 7, July 2008)
Page(s): 1267 - 1278
Date of Publication: 09 July 2008
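
The abstract describes a one-versus-rest, class-dependent feature selection scheme: one binary classifier per class, each trained on its own feature subset, with the models' outputs weighted before they are compared to pick a class. The sketch below is a hypothetical, simplified illustration of that idea using scikit-learn. The SVC classifier matches the paper's choice of SVM, but the mutual-information ranking (a stand-in for the RELIEF, class separability, and mRMR measures named in the abstract), the simple forward search over ranked features, and the unit-variance scaling of decision values (a stand-in for the paper's weighting method) are assumptions for illustration, not the authors' exact algorithm.

    # Hypothetical sketch of class-dependent feature selection with one-vs-rest SVMs.
    # Assumptions (not from the paper): scikit-learn is available, features are ranked
    # with mutual information as a stand-in for mRMR/RELIEF/class separability, and the
    # forward search and output scaling below are simplified illustrations only.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.feature_selection import mutual_info_classif

    def select_class_dependent_subset(X, y_binary, max_features=10, cv=5):
        """Pick a feature subset for one 'class vs. rest' problem by wrapper evaluation."""
        ranking = np.argsort(mutual_info_classif(X, y_binary))[::-1]  # descending relevance
        best_subset, best_score = ranking[:1].tolist(), -np.inf
        for k in range(1, min(max_features, X.shape[1]) + 1):
            subset = ranking[:k].tolist()
            # Wrapper step: score the candidate subset by cross-validated accuracy.
            score = cross_val_score(SVC(), X[:, subset], y_binary, cv=cv).mean()
            if score > best_score:
                best_subset, best_score = subset, score
        return best_subset

    def fit_class_dependent(X, y):
        """Train one binary SVM per class, each on its own selected feature subset.
        Assumes every class has at least `cv` samples so stratified CV is possible."""
        models = {}
        for c in np.unique(y):
            y_bin = (y == c).astype(int)
            subset = select_class_dependent_subset(X, y_bin)
            models[c] = (subset, SVC().fit(X[:, subset], y_bin))
        return models

    def predict_class_dependent(models, X):
        """Compare rescaled decision values across the per-class models."""
        classes = sorted(models)
        scores = np.column_stack([
            models[c][1].decision_function(X[:, models[c][0]]) for c in classes
        ])
        # Stand-in for the paper's output-weighting step: scale each column to unit std
        # so models trained on different feature subsets are roughly comparable.
        scores /= scores.std(axis=0, keepdims=True) + 1e-12
        return np.array(classes)[np.argmax(scores, axis=1)]

In this sketch, each binary model may end up with a different subset (and a different number) of features, which is the point of the class-dependent approach; the rescaling step exists only because raw SVM decision values from differently trained models are not directly comparable.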

