
A modified scaling parameter for the memoryless BFGS updating formula

  • Original Paper
  • Numerical Algorithms

Abstract

Based on an eigenvalue analysis, the condition number of the scaled memoryless BFGS (Broyden–Fletcher–Goldfarb–Shanno) updating formula is obtained. Then, a modified scaling parameter is proposed for this updating formula by minimizing the given condition number. The suggested scaling parameter can be considered a modified version of the self–scaling parameter proposed by Oren and Spedicato. Numerical experiments demonstrate the practical effectiveness of the proposed scaling parameter.
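
For orientation, the following minimal Python sketch forms the scaled memoryless BFGS search direction matrix-free and computes the classical Oren–Spedicato self-scaling parameter, theta = (s'y)/(y'y), which the proposed parameter modifies. The modified parameter itself is not stated in the abstract and is therefore not reproduced here; the function names and test data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def scaled_memoryless_bfgs_direction(g, s, y, theta):
    """Search direction d = -H g for the scaled memoryless BFGS update

        H = theta * (I - rho*s*y') (I - rho*y*s') + rho*s*s',
        rho = 1 / (s'y),

    formed matrix-free in O(n) without building H explicitly."""
    rho = 1.0 / (s @ y)          # requires the curvature condition s'y > 0
    sg, yg = s @ g, y @ g        # the two inner products with the gradient
    Hg = (theta * g
          - theta * rho * (sg * y + yg * s)
          + (theta * rho**2 * (y @ y) + rho) * sg * s)
    return -Hg

def oren_spedicato_theta(s, y):
    """Classical Oren-Spedicato self-scaling parameter: theta = s'y / y'y.
    The paper proposes a modification of this parameter, obtained by
    minimizing the condition number of the update (not derived here)."""
    return (s @ y) / (y @ y)

# Toy usage with synthetic data (illustrative values only).
rng = np.random.default_rng(0)
g = rng.standard_normal(6)             # current gradient g_{k+1}
s = rng.standard_normal(6)             # step s_k = x_{k+1} - x_k
y = s + 0.1 * rng.standard_normal(6)   # gradient difference y_k, here s'y > 0
d = scaled_memoryless_bfgs_direction(g, s, y, oren_spedicato_theta(s, y))
```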

References

  1. Andrei, N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20(6), 645–650 (2007)

  2. Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38(3), 401–416 (2007)

  3. Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)

  4. Andrei, N.: A scaled nonlinear conjugate gradient algorithm for unconstrained optimization. Optimization 57(4), 549–570 (2008)

  5. Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. European J. Oper. Res. 204(3), 410–420 (2010)

  6. Andrei, N.: Open problems in conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)

  7. Babaie–Kafaki, S.: A modified BFGS algorithm based on a hybrid secant equation. Sci. China Math. 54(9), 2019–2036 (2011)

  8. Babaie–Kafaki, S.: A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei. Comput. Optim. Appl. 52(2), 409–414 (2012)

  9. Babaie–Kafaki, S.: A quadratic hybridization of Polak–Ribière–Polyak and Fletcher–Reeves conjugate gradient methods. J. Optim. Theory Appl. 154(3), 916–932 (2012)

  10. Babaie–Kafaki, S.: A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization. 4OR 11(4), 361–374 (2013)

  11. Babaie–Kafaki, S.: A new proof for the sufficient descent condition of Andrei’s scaled conjugate gradient algorithms. Pac. J. Optim. 9(1), 23–28 (2013)

  12. Babaie–Kafaki, S.: Two modified scaled nonlinear conjugate gradient methods. J. Comput. Appl. Math. 261(5), 172–182 (2014)

  13. Babaie–Kafaki, S.: On optimality of the parameters of self–scaling memoryless quasi–Newton updating formulae. J. Optim. Theory Appl. (2015). doi:10.1007/s10957-015-0724-x

  14. Babaie–Kafaki, S., Ghanbari, R.: A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bull. Belg. Math. Soc. Simon Stevin 21(3), 465–477 (2014)

  15. Babaie–Kafaki, S., Ghanbari, R., Mahdavi–Amiri, N.: Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234(5), 1374–1386 (2010)

  16. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  17. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)

  18. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)

  19. Kou, C.X., Dai, Y.H.: A modified self–scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for unconstrained optimization. J. Optim. Theory Appl. 165(1), 209–224 (2015)

  20. Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1–2), 15–35 (2001)

  21. Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202(2), 523–539 (2007)

  22. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)

  23. Oren, S.S.: Self–scaling variable metric (SSVM) algorithms. II. Implementation and experiments. Management Sci. 20(5), 863–874 (1974)

  24. Oren, S.S., Luenberger, D.G.: Self–scaling variable metric (SSVM) algorithms. I. Criteria and sufficient conditions for scaling a class of algorithms. Management Sci. 20(5), 845–862 (1973/74)

  25. Oren, S.S., Spedicato, E.: Optimal conditioning of self–scaling variable metric algorithms. Math. Program. 10(1), 70–90 (1976)

  26. Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)

  27. Watkins, D.S.: Fundamentals of Matrix Computations. Wiley, New York (2002)

  28. Wei, Z., Li, G., Qi, L.: New quasi–Newton methods for unconstrained optimization problems. Appl. Math. Comput. 175(2), 1156–1188 (2006)

  29. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)

  30. Yuan, Y.X., Byrd, R.H.: Non–quasi–Newton updates for unconstrained optimization. J. Comput. Math. 13(2), 95–107 (1995)

  31. Zhang, J., Xu, C.: Properties and numerical performance of quasi–Newton methods with modified quasi–Newton equations. J. Comput. Appl. Math. 137(2), 269–278 (2001)

  32. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi–Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102(1), 147–167 (1999)

  33. Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)

Author information

Correspondence to Saman Babaie–Kafaki.

About this article

Cite this article

Babaie–Kafaki, S. A modified scaling parameter for the memoryless BFGS updating formula. Numer Algor 72, 425–433 (2016). https://doi.org/10.1007/s11075-015-0053-z

Download citation

  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s11075-015-0053-z
