Abstract
Techniques for obtaining safely positive definite Hessian approximations, based on self-scaling and modified quasi-Newton updates, are combined to obtain ‘better’ curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, is globally and superlinearly convergent for convex functions. Numerical experiments with this class, using the well-known quasi-Newton BFGS and DFP updates and a modified SR1 update, are presented to illustrate some advantages of the new techniques. These experiments show that the performance of several combined methods is substantially better than that of the standard BFGS method. Similar improvements are also obtained if the simple sufficient function reduction condition on the steplength is used instead of the strong Wolfe conditions.
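To make the combination described above concrete, the following is a minimal illustrative sketch of one update of this general type: a BFGS update with an Oren–Luenberger-style self-scaling factor and Powell-type damping of the gradient difference, which keeps the Hessian approximation safely positive definite. The function name, the threshold parameter `sigma`, and the specific combination shown are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def damped_scaled_bfgs_update(B, s, y, sigma=0.2):
    """One self-scaling, damped BFGS update of the Hessian
    approximation B (an illustrative sketch, not the paper's
    exact method).

    B     : current symmetric positive definite approximation
    s     : step x_{k+1} - x_k
    y     : gradient difference g_{k+1} - g_k
    sigma : damping threshold (Powell's classical choice is 0.2)
    """
    Bs = B @ s
    sBs = s @ Bs            # s^T B s > 0 since B is positive definite
    sy = s @ y              # curvature along the step

    # Powell-type damping: if the curvature s^T y is too small (or
    # negative), replace y by a convex combination of y and Bs so that
    # the modified curvature equals sigma * s^T B s > 0, which keeps
    # the updated matrix positive definite.
    if sy < sigma * sBs:
        theta = (1.0 - sigma) * sBs / (sBs - sy)
        y = theta * y + (1.0 - theta) * Bs
        sy = s @ y          # now sy == sigma * sBs > 0

    # Self-scaling factor tau = y^T s / (s^T B s) rescales B before
    # applying the usual rank-two BFGS correction.
    tau = sy / sBs
    B_new = tau * (B - np.outer(Bs, Bs) / sBs) + np.outer(y, y) / sy
    return B_new
```

By construction the update satisfies the secant condition `B_new @ s == y` (with the possibly damped `y`), and the damping step guarantees the denominator `sy` stays positive, so positive definiteness is preserved even when the raw curvature `s^T y` is negative.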
Cite this article
Al-Baali, M., Khalfan, H. A combined class of self-scaling and modified quasi-Newton methods. Comput Optim Appl 52, 393–408 (2012). https://doi.org/10.1007/s10589-011-9415-1