
Accelerated Proximal Subsampled Newton Method


Abstract:

The composite function optimization problem often arises in machine learning in the form of regularized empirical minimization. We introduce an acceleration technique into the Newton-type proximal method and propose a novel algorithm called the accelerated proximal subsampled Newton method (APSSN). APSSN subsamples only a small subset of the samples to construct an approximate Hessian, which achieves computational efficiency while still retaining a fast convergence rate. Furthermore, we obtain the scaled proximal mapping by solving its dual problem with the semismooth Newton method rather than resorting to first-order methods. Owing to our sampling strategy and the fast convergence rate of the semismooth Newton method, the scaled proximal mapping can be computed efficiently. Both our theoretical analysis and our empirical study show that APSSN is an effective and computationally efficient algorithm for composite function optimization problems.
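
To make the subsampled proximal Newton step described in the abstract concrete, the following is a minimal sketch for an L1-regularized logistic regression instance of the composite problem. Everything here (function names, the logistic loss instance, the damping term, the subsample size, and the inner solver) is an illustrative assumption, not the authors' implementation; in particular, the sketch solves the scaled proximal mapping with a few proximal-gradient inner iterations for brevity, whereas the paper applies a semismooth Newton method to its dual problem, and the outer acceleration step of APSSN is omitted.

import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_subsampled_newton_step(x, A, b, lam, sample_size, rng, inner_iters=50):
    # One proximal Newton step (hypothetical sketch) for
    #     min_x (1/n) * sum_i log(1 + exp(-b_i * a_i^T x)) + lam * ||x||_1,
    # with the Hessian estimated from a random subsample of the rows.
    n, d = A.shape

    # Full gradient of the smooth part.
    z = b * (A @ x)
    g = -(A * (b * sigmoid(-z))[:, None]).mean(axis=0)

    # Subsampled Hessian: curvature from only |S| << n samples, plus a small
    # damping term (an assumption of this sketch) to keep it invertible.
    S = rng.choice(n, size=sample_size, replace=False)
    w = sigmoid(z[S]) * sigmoid(-z[S])        # logistic curvature weights
    H = (A[S].T * w) @ A[S] / sample_size + 1e-8 * np.eye(d)

    # Scaled proximal mapping:
    #     x_next = argmin_u lam * ||u||_1 + 0.5 * (u - v)^T H (u - v),
    # where v = x - H^{-1} g. The paper solves this subproblem via a
    # semismooth Newton method on its dual; for brevity this sketch runs a
    # few proximal-gradient inner iterations instead.
    v = x - np.linalg.solve(H, g)
    step = 1.0 / np.linalg.norm(H, 2)         # 1 / Lipschitz constant of u -> H(u - v)
    u = x.copy()
    for _ in range(inner_iters):
        u = soft_threshold(u - step * (H @ (u - v)), lam * step)
    return u

# Toy usage: a few outer iterations on synthetic data.
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 20))
b = np.sign(A @ rng.standard_normal(20) + 0.1 * rng.standard_normal(1000))
x = np.zeros(20)
for _ in range(10):
    x = proximal_subsampled_newton_step(x, A, b, lam=0.01, sample_size=100, rng=rng)

Because the Hessian is built from sample_size rows rather than all n, each step costs far less than a full Newton step while the scaled proximal subproblem keeps the second-order curvature information.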
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 32, Issue: 10, October 2021)
Page(s): 4374 - 4388
Date of Publication: 09 September 2020

PubMed ID: 32903188
