Abstract
A three-term conjugate gradient algorithm for large-scale unconstrained optimization, based on a subspace minimization technique, is presented. In this algorithm the search direction is computed by minimizing the quadratic approximation of the objective function over the subspace spanned by the vectors −g_{k+1}, s_k and y_k; that is, the search direction is taken as d_{k+1} = −g_{k+1} + a_k s_k + b_k y_k, where the scalars a_k and b_k are determined by minimizing the quadratic approximation of the objective function over this subspace. The step-lengths are determined by the Wolfe line search conditions. We prove that the search directions are descent directions and satisfy the Dai–Liao conjugacy condition, so the suggested algorithm is of three-term conjugate gradient type with both the descent and the conjugacy conditions guaranteed. It is shown that, for uniformly convex functions, the directions generated by the algorithm are bounded above, i.e. the algorithm is convergent. Numerical experiments on a set of 750 unconstrained optimization test problems show that this new algorithm substantially outperforms the known Hestenes–Stiefel, Dai–Liao, Dai–Yuan and Polak–Ribière–Polyak conjugate gradient algorithms, as well as the limited-memory quasi-Newton method L-BFGS and the discrete truncated-Newton method TN.
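As a sketch of the idea (not the paper's specific formulas, which rely on the authors' own approximations rather than an explicit Hessian), write the quadratic model at x_{k+1} as q(d) = f_{k+1} + g_{k+1}^T d + ½ d^T B_{k+1} d with B_{k+1} a symmetric approximation of the Hessian, and restrict d to the subspace d = −g_{k+1} + a s_k + b y_k. Setting the partial derivatives of q with respect to a and b to zero gives a 2×2 linear system for the scalars:

```latex
\begin{aligned}
a\, s_k^{\top} B_{k+1} s_k \;+\; b\, s_k^{\top} B_{k+1} y_k
  &= s_k^{\top} B_{k+1} g_{k+1} \;-\; s_k^{\top} g_{k+1},\\[2pt]
a\, y_k^{\top} B_{k+1} s_k \;+\; b\, y_k^{\top} B_{k+1} y_k
  &= y_k^{\top} B_{k+1} g_{k+1} \;-\; y_k^{\top} g_{k+1}.
\end{aligned}
```

Solving this system yields a_k and b_k, and hence d_{k+1}. In practice a subspace method of this type replaces the products involving B_{k+1} by quantities available from the iteration (e.g. secant-type information), so that B_{k+1} is never formed explicitly.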
References
Al-Bayati, A.Y., Sharif, W.H.: A new three-term conjugate gradient method for unconstrained optimization. Can. J. Sci. Eng. Math. 1(5), 108–124 (2010)
Andrei, N.: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algorithms 42, 63–73 (2006)
Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)
Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213, 361–369 (2009)
Andrei, N.: A modified Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization. Optimization 60, 1457–1471 (2011)
Beale, E.M.L.: A derivation of conjugate gradients. In: Lootsma, F.A. (ed.) Numerical Methods for Nonlinear Optimization, pp. 39–43. Academic Press, London (1972)
Cheng, W.: A two-term PRP-based descent method. Numer. Funct. Anal. Optim. 28, 1217–1230 (2007)
Conn, A.R., Gould, N., Sartenaer, A., Toint, Ph.L.: On iterated-subspace minimization methods for nonlinear optimization. In: Adams, L., Nazareth, J.L. (eds.) Linear and Nonlinear Conjugate Gradient related methods, SIAM, pp. 50–78 (1996)
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)
Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)
Deng, N.Y., Li, Z.: Global convergence of three-term conjugate gradient methods. Optim. Methods Softw. 4, 273–282 (1995)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Fletcher, R.: Practical Methods of Optimization, Vol. 1: Unconstrained Optimization. Wiley, New York (1987)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
Gilbert, J.C., Lemaréchal, C.: Some numerical experiments with variable-storage quasi-Newton algorithms. Math. Program. 45, 407–435 (1989)
Hager, W.W., Zhang, H.C.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
Hestenes, M.R., Stiefel, E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952)
Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503–528 (1989)
Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, Part 1: Theory. J. Optim. Theor. Appl. 69, 129–137 (1991)
Moré, J.J., Thuente, D.J.: On line search algorithms with guaranteed sufficient decrease. Mathematics and Computer Science Division Preprint MCS-P153-0590. Argonne National Laboratory, Argonne, IL (1990)
Narushima, Y., Yabe, H., Ford, J.A.: A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21(1), 212–230 (2011)
Nash, S.G.: User’s guide for TN-TNBC: Fortran routines for nonlinear optimization. Report 397, Mathematical Sciences Department, The Johns Hopkins University, Baltimore, MD
Nash, S.G., Nocedal, J.: A numerical study of the limited memory BFGS method and the truncated-Newton method for large scale optimization. SIAM J. Optim. 1, 358–372 (1991)
Nazareth, L.: A conjugate direction algorithm without line search. J. Optim. Theor. Appl. 23, 373–387 (1977)
Nocedal, J.: Updating quasi-Newton matrices with limited storage. Math. Comp. 35, 773–782 (1980)
Nocedal, J.: Conjugate gradient methods and nonlinear optimization. In: Adams, L., Nazareth, J.L. (eds.) Linear and Nonlinear Conjugate Gradient related methods, SIAM, pp. 9–23 (1996)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle, 3e Année, vol. 16, pp. 35–43 (1969)
Polyak, B.T.: The conjugate gradient method in extreme problems. USSR Comp. Math. Math. Phys. 9, 94–112 (1969)
Powell, M.J.D.: Nonconvex Minimization Calculations and the Conjugate Gradient Method. Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, vol. 1066, pp. 122–141, Springer, Berlin (1984)
Stanimirović, P.S., Miladinović, M.B.: Accelerated gradient descent methods with line search. Numer. Algorithms 54, 503–520 (2010)
Stoer, J., Yuan, Y.X.: A subspace study on conjugate gradient algorithms. ZAMM. Z. Angew. Math. Mech. 75, 69–77 (1995)
Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969)
Wolfe, P.: Convergence conditions for ascent methods, (II): some corrections. SIAM Rev. 13, 185–188 (1971)
Zhang, L., Zhou, W., Li, D.H.: A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26, 629–640 (2006)
Zhang, L., Zhou, W., Li, D.H.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22, 697–711 (2007)
Zhang, J., Xiao, Y., Wei, Z.: Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization. Math. Probl. Eng. (2009). Article ID 243290. doi:10.1155/2009/243290
Zhang, L., Zhou, Y.: A note on the convergence properties of the original three-term Hestenes-Stiefel method. AMO – Adv. Model. Optim. 14, 159–163 (2012)
Zoutendijk, G.: Nonlinear programming, computational methods. In: Abadie, J. (ed.) Integer and Nonlinear Programming, pp. 38–86. North-Holland, Amsterdam (1970)
Additional information
Dr. Neculai Andrei is a member of the Academy of Romanian Scientists, Splaiul Independenţei Nr. 54, Sector 5, Bucharest, Romania.
Andrei, N. An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer Algor 65, 859–874 (2014). https://doi.org/10.1007/s11075-013-9718-7