Abstract:
Composite function optimization problems often arise in machine learning in the form of regularized empirical risk minimization. We introduce an acceleration technique into the Newton-type proximal method and propose a novel algorithm called the accelerated proximal subsampled Newton method (APSSN). APSSN subsamples only a small subset of samples to construct an approximate Hessian, which yields computational efficiency while still retaining a fast convergence rate. Furthermore, we obtain the scaled proximal mapping by solving its dual problem with the semismooth Newton method instead of resorting to first-order methods. Owing to our sampling strategy and the fast convergence of the semismooth Newton method, the scaled proximal mapping can be computed efficiently. Both our theoretical analysis and empirical study show that APSSN is an effective and computationally efficient algorithm for composite function optimization problems.
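To make the abstract's description concrete, below is a minimal, hypothetical Python sketch of one accelerated proximal subsampled Newton loop. It is not the paper's algorithm: the momentum schedule, sampling rule, and the dual semismooth Newton solver for the scaled proximal mapping are abstracted away, and the callables grad_f, hess_f_sample, and prox_scaled, as well as parameters such as sample_size, are assumptions introduced here for illustration.

```python
import numpy as np

def apssn_sketch(X, grad_f, hess_f_sample, prox_scaled, w0,
                 sample_size=256, n_iters=50, seed=None):
    """Hypothetical sketch of an accelerated proximal subsampled Newton loop.

    grad_f(w): full gradient of the smooth loss at w.
    hess_f_sample(w, idx): approximate Hessian at w built from the subsample idx.
    prox_scaled(u, H): scaled proximal mapping
        argmin_z g(z) + 0.5 * (z - u)^T H (z - u);
    in the paper this subproblem is solved via a semismooth Newton method
    on its dual, here it is just an abstract callable.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    w, w_prev = w0.copy(), w0.copy()
    for t in range(1, n_iters + 1):
        # Nesterov-style extrapolation (one common acceleration choice;
        # the paper's exact momentum schedule may differ).
        beta = (t - 1) / (t + 2)
        v = w + beta * (w - w_prev)
        # Subsample a small index set to build an approximate Hessian.
        idx = rng.choice(n, size=min(sample_size, n), replace=False)
        H = hess_f_sample(v, idx)
        # Proximal Newton step: Newton direction followed by the scaled prox.
        u = v - np.linalg.solve(H, grad_f(v))
        w_prev, w = w, prox_scaled(u, H)
    return w
```

The key cost savings the abstract points to are visible here: the Hessian is formed from a subsample rather than the full data set, and the per-iteration bottleneck shifts to the scaled proximal subproblem, which the paper accelerates with a semismooth Newton solver.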
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 32, Issue: 10, October 2021)