A new adaptive Barzilai and Borwein method for unconstrained optimization

  • Original Paper
  • Optimization Letters

Abstract

In this paper we view the Barzilai and Borwein (BB) method from a new angle and present a new adaptive Barzilai and Borwein (NABB) method with a nonmonotone line search for general unconstrained optimization. In the proposed method, the scalar approximation to the Hessian matrix is updated by the Broyden class formula to generate an adaptive stepsize. Remarkably, the new stepsize is chosen adaptively within the interval bounded by the two well-known BB stepsizes. Moreover, for negative curvature directions, a stepsize selection strategy is designed to accelerate the convergence of the NABB method. Furthermore, we apply the NABB method, without any line search, to strictly convex quadratic minimization. Numerical experiments show that the NABB method is very promising.
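To make the stepsize construction concrete, the Python sketch below combines the two classical BB stepsizes, the long stepsize s^T s / s^T y and the short stepsize s^T y / y^T y (with s = x_k - x_{k-1} and y = g_k - g_{k-1}), so that the resulting stepsize always lies in the interval they bound. Because the abstract does not spell out the Broyden-class update, the fixed weight tau, the function name nabb_sketch, and the negative-curvature safeguard are illustrative assumptions rather than the authors' actual NABB rule; the nonmonotone line search is also omitted, as in the quadratic setting mentioned at the end of the abstract.

    import numpy as np

    def nabb_sketch(grad, x0, tau=0.5, alpha0=1.0, tol=1e-6, max_iter=10000):
        # Gradient method whose stepsize is chosen between the two BB stepsizes.
        # The convex combination with weight tau is a hypothetical stand-in for
        # the paper's Broyden-class scalar update, and the negative-curvature
        # fallback is likewise illustrative; no line search is performed.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        alpha = alpha0
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            x_new = x - alpha * g
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            sy = s @ y
            if sy > 0:                        # positive curvature along s
                bb1 = (s @ s) / sy            # long BB stepsize  s^T s / s^T y
                bb2 = sy / (y @ y)            # short BB stepsize s^T y / y^T y (bb2 <= bb1)
                alpha = tau * bb2 + (1.0 - tau) * bb1   # lies in [bb2, bb1]
            else:
                # Negative curvature: the BB formulas are unusable here,
                # so fall back on a bounded safeguard stepsize.
                alpha = min(1.0, 1.0 / np.linalg.norm(g_new))
            x, g = x_new, g_new
        return x

    # Demo on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x (gradient
    # A x - b), the setting in which the NABB method is run without line search.
    A = np.diag(np.linspace(1.0, 100.0, 100))   # SPD diagonal, condition number 100
    b = np.ones(100)
    x = nabb_sketch(lambda z: A @ z - b, np.zeros(100))
    print(np.linalg.norm(A @ x - b))            # gradient norm at return (<= 1e-6 on convergence)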

References

  1. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

  2. Cauchy, A.: Méthode générale pour la résolution des systèmes d'équations simultanées. Comp. Rend. Sci. Paris 25, 46–89 (1847)

  3. Birgin, E.G., Martínez, J.M., Raydan, M.: Spectral projected gradient methods: review and perspectives. J. Stat. Softw. 60(3), 1–21 (2014)

  4. Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13, 321–326 (1993)

  5. Dai, Y.H., Liao, L.Z.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22(1), 1–10 (2002)

  6. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23, 707–716 (1986)

  7. Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26–33 (1997)

  8. Dai, Y.H., Hager, W.W., Schittkowski, K., et al.: The cyclic Barzilai–Borwein method for unconstrained optimization. IMA J. Numer. Anal. 26(3), 604–627 (2006)

  9. Glunt, W., Hayden, T.L., Raydan, M.: Molecular conformations from distance matrices. J. Comput. Chem. 14(1), 114–120 (1993)

  10. Dai, Y.H.: A new analysis on the Barzilai-Borwein gradient method. J. Oper. Res. Soc. China 1(2), 187–198 (2013)

  11. Zhou, B., Gao, L., Dai, Y.H.: Gradient methods with adaptive stepsizes. Comput. Optim. Appl. 35(1), 69–86 (2006)

  12. Biglari, F., Solimanpur, M.: Scaling on the spectral gradient method. J. Optim. Theory Appl. 158(2), 626–635 (2013)

  13. Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization. Comput. Optim. Appl. 22, 103–109 (2002)

  14. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102, 147–167 (1999)

  15. Wei, Z.X., Li, G.Y., Qi, L.Q.: New quasi-Newton methods for unconstrained optimization problems. Appl. Math. Comput. 175, 1156–1188 (2006)

  16. Xiao, Y.H., Wang, Q.Y., Wang, D., et al.: Notes on the Dai-Yuan-Yuan modified spectral gradient method. J. Comput. Appl. Math. 234(10), 2986–2992 (2010)

  17. Biglari, F., Hassan, M.A., Leong, W.J.: New quasi-Newton methods via higher order tensor models. J. Comput. Appl. Math. 235(8), 2412–2422 (2011)

  18. Dennis, J.E., Wolkowicz, H.: Sizing and least change secant methods. SIAM J. Numer. Anal. 30, 1291–1313 (1993)

  19. Yuan, Y.X., Sun, W.Y.: Theory and Methods of Optimization. Science Press of China, Beijing (1999)

  20. Watkins, D.S.: Fundamentals of Matrix Computations, 2nd edn. Wiley, New York (2002)

  21. Luengo, F., Raydan, M.: Gradient method with dynamical retards for large-scale optimization problems. Electron. Trans. Numer. Anal. 16, 186–193 (2003)

  22. Zhang, H.C., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043–1056 (2004)

  23. Birgin, E.G., Martínez, J.M., Raydan, M.: Nonmonotone spectral projected gradient methods for convex sets. SIAM J. Optim. 10(4), 1196–1211 (2000)

  24. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)

  25. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  26. Dai, Y.H.: Alternate step gradient method. Optimization 52(4–5), 395–415 (2003)

  27. Yuan, Y.X.: A new stepsize for the steepest descent method. J. Comput. Math. 24, 149–156 (2006)

  28. De Asmundis, R., di Serafino, D., Riccio, F., et al.: On spectral properties of steepest descent methods. IMA J. Numer. Anal. 33(4), 1416–1435 (2013)

  29. Friedlander, A., Martínez, J.M., Molina, B., et al.: Gradient method with retards and generalizations. SIAM J. Numer. Anal. 36, 275–289 (1999)

Acknowledgements

We would like to thank Professor Y. H. Dai and Dr. Caixia Kou for their Fortran code for the BB method. We also wish to thank Dr. Yakui Huang for his careful reading and helpful comments, and the two anonymous referees for their kind and valuable comments, which have helped improve the quality of this paper. This research was supported by the National Natural Science Foundation of China (Nos. 11461021, 11601012), the Guangxi Science Foundation (Nos. 2014GXNSFAA118028, 2015GXNSFAA139011), the Scientific Research Foundation of the Guangxi Education Department (No. 2013YB236), the Scientific Research Project of Hezhou University (Nos. 2014YBZK06, 2016HZXYSX03), and the Guangxi Colleges and Universities Key Laboratory of Symbolic Computation and Engineering Data Processing.

Author information

Correspondence to Zexian Liu.

About this article

Cite this article

Liu, H., Liu, Z. & Dong, X. A new adaptive Barzilai and Borwein method for unconstrained optimization. Optim Lett 12, 845–873 (2018). https://doi.org/10.1007/s11590-017-1150-9
