
Greedy Projected Gradient-Newton Method for Sparse Logistic Regression

Publisher: IEEE

Abstract:

Sparse logistic regression (SLR), which is widely used for classification and feature selection in many fields, such as neural networks, deep learning, and bioinformatics, is the classical logistic regression model with sparsity constraints. In this paper, we perform theoretical analysis on the existence and uniqueness of the solution to the SLR, and we propose a greedy projected gradient-Newton (GPGN) method for solving the SLR. The GPGN method is a combination of the projected gradient method and the Newton method. The following characteristics show that the GPGN method achieves not only elegant theoretical results but also remarkable numerical performance in solving the SLR: 1) the full iterative sequence generated by the GPGN method converges to a global/local minimizer of the SLR under weaker conditions; 2) the GPGN method has the properties of finite identification of an optimal support set and local quadratic convergence; and 3) the GPGN method achieves higher accuracy and higher speed compared with a number of state-of-the-art solvers according to numerical experiments.
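The abstract describes the GPGN method only at a high level: a projected gradient step handles the sparsity constraint, and a Newton step refines the solution. The following is a minimal sketch of such a two-stage iteration, assuming the projection onto the sparsity constraint is hard thresholding (keeping the s largest-magnitude entries) and the Newton refinement is restricted to the identified support. The function names, step size, and stopping rule are illustrative assumptions, not the paper's exact GPGN algorithm.

```python
# Illustrative sketch only: hard-thresholding projected gradient step plus
# Newton refinement on the identified support; not the authors' exact GPGN method.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_grad(X, y, w):
    """Gradient of the average logistic loss with labels y in {-1, +1}."""
    n = X.shape[0]
    return -(X.T @ (y * sigmoid(-y * (X @ w)))) / n

def sparse_logreg_sketch(X, y, s, step=1.0, outer_iters=200, newton_iters=5, tol=1e-8):
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(outer_iters):
        # Projected gradient step: gradient descent followed by hard
        # thresholding that keeps the s largest-magnitude entries.
        u = w - step * loss_grad(X, y, w)
        support = np.argsort(-np.abs(u))[:s]
        w_new = np.zeros(d)
        w_new[support] = u[support]

        # Newton refinement on the fixed support: a few Newton steps on the
        # reduced s-dimensional logistic regression problem.
        Xs = X[:, support]
        ws = w_new[support]
        for _ in range(newton_iters):
            z = y * (Xs @ ws)
            g = -(Xs.T @ (y * sigmoid(-z))) / n
            D = sigmoid(z) * sigmoid(-z)              # Hessian weights
            H = (Xs.T * D) @ Xs / n + 1e-10 * np.eye(s)
            ws = ws - np.linalg.solve(H, g)
        w_new[support] = ws

        if np.linalg.norm(w_new - w) < tol:
            return w_new
        w = w_new
    return w
```

In this sketch, the outer loop mirrors the "greedy projected gradient" idea (the support can change from one iteration to the next), while the inner loop mirrors the Newton refinement that, per the abstract, yields local quadratic convergence once the optimal support is identified and fixed.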
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 31, Issue: 2, February 2020)
Page(s): 527 - 538
Date of Publication: 11 April 2019

PubMed ID: 30990444
Publisher: IEEE
