Abstract
Based on an eigenvalue analysis, the condition number of the scaled memoryless BFGS (Broyden–Fletcher–Goldfarb–Shanno) updating formula is obtained. A modified scaling parameter is then proposed for this updating formula, chosen to minimize the given condition number. The suggested scaling parameter can be regarded as a modified version of the self-scaling parameter proposed by Oren and Spedicato. Numerical experiments demonstrate the practical effectiveness of the proposed scaling parameter.
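As background for the abstract, the following sketch illustrates how a scaled memoryless BFGS matrix is typically applied: the standard BFGS update is applied to the scaled identity θI using the pair (s, y), and the search direction d = -Hg is computed by matrix-vector products only. This is a minimal illustration of the general scheme, not the paper's specific modified parameter; the function name and the use of the classical Oren–Spedicato-type choice θ = sᵀy / yᵀy in the usage line are assumptions for illustration.

```python
import numpy as np

def scaled_memoryless_bfgs_direction(g, s, y, theta):
    """Compute d = -H g, where H is the BFGS update of theta*I
    with the correction pair (s, y).

    H = theta*I - theta*(s y^T + y s^T)/(s^T y)
        + (1 + theta*(y^T y)/(s^T y)) * (s s^T)/(s^T y)

    Only inner products are used, so the cost is O(n) per call.
    """
    sy = s @ y          # curvature s^T y, assumed positive (Wolfe conditions)
    yg = y @ g
    sg = s @ g
    Hg = (theta * g
          - theta * (yg * s + sg * y) / sy
          + (1.0 + theta * (y @ y) / sy) * sg * s / sy)
    return -Hg

# Illustrative usage with the classical scaling theta = s^T y / y^T y
# (an Oren–Spedicato-type choice; the paper proposes a modified parameter).
g = np.array([1.0, 2.0, 3.0])
s = np.array([0.5, -0.2, 0.1])
y = np.array([0.4, -0.1, 0.2])
theta = (s @ y) / (y @ y)
d = scaled_memoryless_bfgs_direction(g, s, y, theta)
```

Because H is positive definite whenever sᵀy > 0 and θ > 0, the resulting d is a descent direction (gᵀd < 0).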
References
Andrei, N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20(6), 645–650 (2007)
Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38(3), 401–416 (2007)
Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)
Andrei, N.: A scaled nonlinear conjugate gradient algorithm for unconstrained optimization. Optimization 57(4), 549–570 (2008)
Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. European J. Oper. Res. 204(3), 410–420 (2010)
Andrei, N.: Open problems in conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)
Babaie–Kafaki, S.: A modified BFGS algorithm based on a hybrid secant equation. Sci. China Math. 54(9), 2019–2036 (2011)
Babaie–Kafaki, S.: A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei. Comput. Optim. Appl. 52(2), 409–414 (2012)
Babaie–Kafaki, S.: A quadratic hybridization of Polak–Ribière–Polyak and Fletcher–Reeves conjugate gradient methods. J. Optim. Theory Appl. 154(3), 916–932 (2012)
Babaie–Kafaki, S.: A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization. 4OR 11(4), 361–374 (2013)
Babaie–Kafaki, S.: A new proof for the sufficient descent condition of Andrei’s scaled conjugate gradient algorithms. Pac. J. Optim. 9(1), 23–28 (2013)
Babaie–Kafaki, S.: Two modified scaled nonlinear conjugate gradient methods. J. Comput. Appl. Math. 261(5), 172–182 (2014)
Babaie–Kafaki, S.: On optimality of the parameters of self–scaling memoryless quasi–Newton updating formulae. J. Optim. Theory Appl. (2015). doi:10.1007/s10957-015-0724-x
Babaie–Kafaki, S., Ghanbari, R.: A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bull. Belg. Math. Soc. Simon Stevin 21(3), 465–477 (2014)
Babaie–Kafaki, S., Ghanbari, R., Mahdavi–Amiri, N.: Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234 (5), 1374–1386 (2010)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Kou, C.X., Dai, Y.H.: A modified self–scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for unconstrained optimization. J. Optim. Theory Appl. 165(1), 209–224 (2015)
Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1–2), 15–35 (2001)
Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202(2), 523–539 (2007)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)
Oren, S.S.: Self–scaling variable metric (SSVM) algorithms. II. Implementation and experiments. Management Sci. 20(5), 863–874 (1974)
Oren, S.S., Luenberger, D.G.: Self–scaling variable metric (SSVM) algorithms. I. Criteria and sufficient conditions for scaling a class of algorithms. Management Sci. 20(5), 845–862 (1973/74)
Oren, S.S., Spedicato, E.: Optimal conditioning of self–scaling variable metric algorithms. Math. Program. 10(1), 70–90 (1976)
Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)
Watkins, D.S.: Fundamentals of Matrix Computations. Wiley, New York (2002)
Wei, Z., Li, G., Qi, L.: New quasi–Newton methods for unconstrained optimization problems. Appl. Math. Comput. 175(2), 1156–1188 (2006)
Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)
Yuan, Y.X., Byrd, R.H.: Non–quasi–Newton updates for unconstrained optimization. J. Comput. Math. 13(2), 95–107 (1995)
Zhang, J., Xu, C.: Properties and numerical performance of quasi–Newton methods with modified quasi–Newton equations. J. Comput. Appl. Math. 137(2), 269–278 (2001)
Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi–Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102(1), 147–167 (1999)
Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)
Babaie–Kafaki, S. A modified scaling parameter for the memoryless BFGS updating formula. Numer Algor 72, 425–433 (2016). https://doi.org/10.1007/s11075-015-0053-z