
Fine tuning Nesterov’s steepest descent algorithm for differentiable convex programming

  • Full Length Paper
  • Series A
Mathematical Programming

Abstract

We modify the first-order algorithm for convex programming described by Nesterov in his book (Introductory Lectures on Convex Optimization. A Basic Course. Kluwer, Boston, 2004). In his algorithms, Nesterov makes explicit use of a Lipschitz constant L for the function gradient, which is either assumed known (Nesterov 2004) or estimated by an adaptive procedure (Nesterov 2007). We eliminate the use of L at the cost of an extra imprecise line search, and obtain an algorithm which keeps the optimal complexity properties and also inherits the global convergence properties of the steepest descent method for general continuously differentiable optimization. Besides this, we develop an adaptive procedure for estimating a strong convexity constant for the function. Numerical tests on a limited set of toy problems show an improvement in performance compared with Nesterov's original algorithms.
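The general idea in the abstract can be illustrated with a minimal sketch: a Nesterov-style accelerated gradient method in which the Lipschitz constant L is not assumed known, but is increased by backtracking until the standard quadratic upper bound holds. This is the common textbook backtracking variant, not the authors' exact algorithm (which replaces L entirely by an imprecise line search and adds a strong-convexity estimate); the function `accelerated_gradient` and its parameters are illustrative.

```python
import numpy as np

def accelerated_gradient(f, grad, x0, iters=100, L0=1.0):
    """Nesterov-style accelerated gradient with backtracking on L.

    Generic sketch: instead of a known Lipschitz constant, L is
    estimated adaptively by doubling it until the quadratic upper
    bound f(x+) <= f(y) + <g, x+ - y> + (L/2)||x+ - y||^2 holds.
    """
    x = y = np.asarray(x0, dtype=float)
    L = L0
    t = 1.0
    for _ in range(iters):
        g = grad(y)
        # Backtracking: increase L until the descent inequality is satisfied.
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= 2.0
        # Standard Nesterov momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

For a smooth convex objective such as a simple quadratic, the iterates converge at the optimal O(1/k^2) rate even though L is never supplied by the user.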


References

  1. Bertsekas D.P., Nedić A., Ozdaglar A.E.: Convex Analysis and Optimization. Athena Scientific, Belmont (2003)

  2. Lan, G., Lu, Z., Monteiro, R.D.C.: Primal-dual first-order methods with \({{O}(1/\epsilon)}\) iteration-complexity for cone programming. Technical report, School of ISyE, Georgia Tech, USA, Accepted in Mathematical Programming (2006)

  3. Moré, J.J., Thuente, D.J.: Line search algorithms with guaranteed sufficient decrease. Technical Report MCS-P330-1092, Mathematics and Computer Science Division, Argonne National Laboratory (1992)

  4. Nemirovsky A.S., Yudin D.B.: Problem Complexity and Method Efficiency in Optimization. Wiley, New York (1983)

  5. Nesterov Y.: Introductory Lectures on Convex Optimization. A Basic Course. Kluwer, Boston (2004)

  6. Nesterov Y.: Smooth minimization of non-smooth functions. Math. Program. 103(1), 127–152 (2005)

  7. Nesterov, Y.: Gradient methods for minimizing composite objective function. Discussion paper 76, CORE, UCL, Belgium (2007)

  8. Nesterov Y.: Smoothing technique and its applications in semidefinite optimization. Math. Program. 110(2), 245–259 (2007)

  9. Nesterov Y., Polyak B.T.: Cubic regularization of Newton method and its global performance. Math. Program. 108(1), 177–205 (2006)

  10. Nocedal J., Wright S.J.: Numerical Optimization. Springer Series in Operations Research. Springer, Berlin (1999)

  11. Polyak B.T.: Introduction to Optimization. Optimization Software Inc., New York (1987)

  12. Richtárik P.: Improved algorithms for convex minimization in relative scale. SIAM J. Optim. 21(3), 1141–1167 (2011)

  13. Shor N.: Minimization Methods for Non-differentiable Functions. Springer, Berlin (1985)


Author information

Corresponding author

Correspondence to Elizabeth W. Karas.

Additional information

Clóvis C. Gonzaga: Research done while visiting LIMOS—Université Blaise Pascal, Clermont-Ferrand, France. This author is supported by CNPq and PRONEX—Optimization. Elizabeth W. Karas: Research done while visiting the Tokyo Institute of Technology, Global Edge Institute. Supported by CNPq, PRONEX—Optimization and MEXT’s program.


About this article

Cite this article

Gonzaga, C.C., Karas, E.W. Fine tuning Nesterov’s steepest descent algorithm for differentiable convex programming. Math. Program. 138, 141–166 (2013). https://doi.org/10.1007/s10107-012-0541-z

