Abstract
We modify the first-order algorithm for convex programming described by Nesterov in his book (Nesterov 2004). In his algorithms, Nesterov makes explicit use of a Lipschitz constant L for the function gradient, which is either assumed known (Nesterov 2004) or estimated by an adaptive procedure (Nesterov 2007). We eliminate the use of L at the cost of an extra imprecise line search, and obtain an algorithm which keeps the optimal complexity properties and also inherits the global convergence properties of the steepest descent method for general continuously differentiable optimization. In addition, we develop an adaptive procedure for estimating a strong convexity constant for the function. Numerical tests on a limited set of toy problems show an improvement in performance when compared with Nesterov's original algorithms.
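To illustrate the kind of scheme the abstract refers to, the sketch below implements a standard accelerated gradient method in which the Lipschitz constant L is not assumed known but estimated by backtracking on the quadratic upper model, in the spirit of Nesterov (2007). This is an illustrative sketch on a toy least-squares problem, not the line-search algorithm proposed in this paper; the function names and parameters are our own.

```python
import numpy as np

def agm_backtracking(f, grad, x0, L0=1.0, eta=2.0, tol=1e-6, max_iter=2000):
    """Accelerated gradient method with backtracking estimation of the
    Lipschitz constant L (illustrative sketch, not the paper's method)."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0          # momentum parameter sequence
    L = L0           # current Lipschitz estimate
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) < tol:
            break
        # Backtracking: increase L until the quadratic upper model
        # f(x+) <= f(y) + <g, x+ - y> + (L/2)||x+ - y||^2 holds.
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Standard Nesterov momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Toy problem: minimize ||A x - b||^2 starting from L0 = 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
f = lambda x: np.sum((A @ x - b) ** 2)
grad = lambda x: 2.0 * A.T @ (A @ x - b)
x_star = agm_backtracking(f, grad, np.zeros(5))
```

Because L is only increased, never decreased, the estimate remains an overestimate of the true Lipschitz constant along the trajectory, which preserves the optimal O(1/k²) complexity of the exact-L method.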
References
Bertsekas, D.P., Nedić, A., Ozdaglar, A.E.: Convex Analysis and Optimization. Athena Scientific, Belmont (2003)
Lan, G., Lu, Z., Monteiro, R.D.C.: Primal-dual first-order methods with \({{O}(1/\epsilon)}\) iteration-complexity for cone programming. Technical report, School of ISyE, Georgia Tech, USA, Accepted in Mathematical Programming (2006)
Moré, J.J., Thuente, D.J.: Line search algorithms with guaranteed sufficient decrease. Technical Report MCS-P330-1092, Mathematics and Computer Science Division, Argonne National Laboratory (1992)
Nemirovsky, A.S., Yudin, D.B.: Problem Complexity and Method Efficiency in Optimization. Wiley, New York (1983)
Nesterov, Y.: Introductory Lectures on Convex Optimization. A Basic Course. Kluwer, Boston (2004)
Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103(1), 127–152 (2005)
Nesterov, Y.: Gradient methods for minimizing composite objective function. Discussion paper 76, CORE, UCL, Belgium (2007)
Nesterov, Y.: Smoothing technique and its applications in semidefinite optimization. Math. Program. 110(2), 245–259 (2007)
Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Math. Program. 108(1), 177–205 (2006)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer Series in Operations Research. Springer, Berlin (1999)
Polyak, B.T.: Introduction to Optimization. Optimization Software Inc., New York (1987)
Richtárik, P.: Improved algorithms for convex minimization in relative scale. SIAM J. Optim. 21(3), 1141–1167 (2011)
Shor, N.: Minimization Methods for Non-differentiable Functions. Springer, Berlin (1985)
Additional information
Clóvis C. Gonzaga: Research done while visiting LIMOS—Université Blaise Pascal, Clermont-Ferrand, France. This author is supported by CNPq and PRONEX—Optimization. Elizabeth W. Karas: Research done while visiting the Tokyo Institute of Technology, Global Edge Institute. Supported by CNPq, PRONEX—Optimization and MEXT’s program.
Cite this article
Gonzaga, C.C., Karas, E.W. Fine tuning Nesterov’s steepest descent algorithm for differentiable convex programming. Math. Program. 138, 141–166 (2013). https://doi.org/10.1007/s10107-012-0541-z