
On the steepest descent algorithm for quadratic functions

Published in: Computational Optimization and Applications

Abstract

The steepest descent algorithm with exact line searches (the Cauchy algorithm) is inefficient: it generates oscillating step lengths and a sequence of points converging to the span of the eigenvectors associated with the extreme eigenvalues. Performance improves dramatically if a short step is taken every (say) ten iterations. We present a new method for estimating short steps, and propose a method that alternates Cauchy and short steps. Finally, we use the roots of a certain Chebyshev polynomial to further accelerate the methods.
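As a concrete illustration of the behavior the abstract describes, the following sketch implements the classical Cauchy algorithm for a strictly convex quadratic f(x) = ½xᵀAx − bᵀx, for which the exact line-search step is α = gᵀg / gᵀAg. The optional short step 1/λmax taken every tenth iteration is an illustrative stand-in only; the paper's own short-step estimate and its Chebyshev acceleration are not reproduced here.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-8, max_iter=5000, short_every=None):
    """Steepest descent for f(x) = 0.5 x'Ax - b'x with A symmetric positive definite.

    Each iteration takes the exact-line-search (Cauchy) step
    alpha = g'g / g'Ag.  If short_every is set, every short_every-th
    iteration instead uses the short step 1/lambda_max -- an illustrative
    stand-in for the short-step estimates proposed in the paper.
    Returns the final iterate and the number of iterations used.
    """
    lam_max = np.linalg.eigvalsh(A)[-1]     # largest eigenvalue of A
    x = np.asarray(x0, dtype=float)
    for k in range(1, max_iter + 1):
        g = A @ x - b                       # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            return x, k
        if short_every and k % short_every == 0:
            alpha = 1.0 / lam_max           # short step (illustrative choice)
        else:
            alpha = (g @ g) / (g @ (A @ g)) # Cauchy (exact line search) step
        x = x - alpha * g
    return x, max_iter

# Ill-conditioned diagonal quadratic (condition number 100): the pure
# Cauchy iterates zigzag between the extreme-eigenvalue directions.
rng = np.random.default_rng(0)
A = np.diag(np.linspace(1.0, 100.0, 10))
b = rng.standard_normal(10)
x0 = np.zeros(10)
x_c, n_c = steepest_descent(A, b, x0)                  # pure Cauchy steps
x_m, n_m = steepest_descent(A, b, x0, short_every=10)  # short step every 10th
```

Both runs converge to the minimizer A⁻¹b; comparing `n_c` and `n_m` gives a feel for how interleaved short steps affect the zigzagging of the pure Cauchy iteration.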



Acknowledgments

The author was partially supported by CNPq under Grant 308413/2009-1.

Author information

Corresponding author

Correspondence to Clóvis C. Gonzaga.


About this article


Cite this article

Gonzaga, C.C., Schneider, R.M. On the steepest descent algorithm for quadratic functions. Comput Optim Appl 63, 523–542 (2016). https://doi.org/10.1007/s10589-015-9775-z
