
Artificial Intelligence

Volume 224, July 2015, Pages 28-50

Efficient nonconvex sparse group feature selection via continuous and discrete optimization

https://doi.org/10.1016/j.artint.2015.02.008

Abstract

Sparse feature selection has proven to be effective in analyzing high-dimensional data. While promising, most existing works apply convex methods, which may be suboptimal in terms of the accuracy of feature selection and parameter estimation. In this paper, we consider both continuous and discrete nonconvex paradigms for sparse group feature selection, motivated by applications that require identifying the underlying group structure and performing feature selection simultaneously. The main contribution of this article is twofold: (1) computationally, we develop efficient optimization algorithms for both the continuous and discrete formulations, of which the key step is a projection with two coupled constraints; (2) statistically, we show that the proposed continuous model reconstructs the oracle estimator, so that consistent feature selection and parameter estimation are achieved simultaneously. Numerical results on synthetic and real-world data suggest that the proposed nonconvex methods compare favorably against their competitors, achieving the desired goal of delivering high performance.
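To make the "projection with two coupled constraints" concrete: in the discrete formulation one projects onto vectors with at most s_feat nonzero features and at most s_grp nonzero groups. The following is a minimal sketch of a simple greedy approximation of that coupled projection, not the paper's exact algorithm; the function name, the group representation, and the budgets s_feat and s_grp are all hypothetical choices for illustration.

```python
import numpy as np

def coupled_projection(v, groups, s_feat, s_grp):
    """Greedy approximation of projecting v onto the set
    {x : ||x||_0 <= s_feat and number of nonzero groups <= s_grp}.

    v       : 1-D array to project
    groups  : list of index arrays partitioning the coordinates of v
    (Illustrative heuristic only; the exact projection requires a
    more careful combinatorial treatment.)
    """
    # Rank groups by their energy (squared norm) and keep the top s_grp.
    energies = [np.sum(v[g] ** 2) for g in groups]
    kept = np.argsort(energies)[::-1][:s_grp]
    idx = np.concatenate([groups[i] for i in kept])

    # Within the kept groups, retain the s_feat largest-magnitude entries.
    top = idx[np.argsort(np.abs(v[idx]))[::-1][:s_feat]]
    x = np.zeros_like(v)
    x[top] = v[top]
    return x
```

The resulting vector satisfies both budgets by construction, but the greedy group selection need not maximize the retained energy, which is why the exact projection is a nontrivial step in the proposed algorithms.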

Keywords

Nonconvex optimization
Error bound
Discrete optimization
Application
EEG data analysis


This paper is an invited revision of a paper first published at the 30th International Conference on Machine Learning (ICML 2013).