Feature selection for support vector regression via Kernel penalization


Abstract:

This paper presents a novel feature selection approach (KP-SVR) that determines a non-linear regression function with minimal error while simultaneously minimizing the number of features by penalizing their use in the dual formulation of SVR. The approach optimizes the width of an anisotropic RBF kernel using an iterative algorithm based on gradient descent, eliminating features that have low relevance for the regression model. Our approach provides an explicit stopping criterion, indicating clearly when eliminating further features begins to degrade the model's performance. Experiments on two real-world benchmark problems demonstrate that our approach achieves the best performance among well-known feature selection methods while consistently using a small number of features.
Date of Conference: 18-23 July 2010
Date Added to IEEE Xplore: 14 October 2010
Conference Location: Barcelona, Spain
