Abstract
In this paper, a new conjugate gradient method is proposed by applying Powell's symmetrical technique to conjugate gradient methods. Under Wolfe line searches, the global convergence of the method is analyzed via the spectral analysis of the conjugate gradient iteration matrix together with Zoutendijk's condition. On this basis, several concrete descent algorithms are developed. Numerical experiments are presented to verify their performance, and the results show that these algorithms are competitive with the PRP+ algorithm. Finally, a brief discussion of the newly proposed method is given.
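For orientation, the PRP+ algorithm that the abstract uses as a baseline is the Polak-Ribière-Polyak method with the restart β = max(β_PRP, 0) of Gilbert and Nocedal (1992), run under Wolfe line searches. The sketch below shows that baseline, not the paper's new method; the Rosenbrock test function, the crude bisection line search, and the steepest-descent restart safeguard are illustrative choices, not taken from the paper.

```python
import numpy as np

def rosenbrock(x):
    """Classic nonconvex test function with minimizer (1, 1)."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.1, alpha=1.0, max_iter=50):
    """Crude bisection search for a step satisfying the (weak) Wolfe conditions."""
    fx, gx = f(x), grad(x)
    slope = gx @ d  # must be negative: d is a descent direction
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:     # Armijo fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * slope:         # curvature fails: grow
            lo = alpha
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """PRP+ conjugate gradient: beta = max(beta_PRP, 0), after Gilbert & Nocedal (1992)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)  # PRP+ truncation
        d = -g_new + beta * d
        if g_new @ d >= 0:  # safeguard: restart if d is not a descent direction
            d = -g_new
        g = g_new
    return x

x_star = prp_plus_cg(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(x_star)
```

Under weak Wolfe line searches this β-truncation is what restores global convergence of PRP-type methods, which is the same Zoutendijk-condition framework the abstract's analysis works in.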
References
Al-Baali, M.: Descent property and global convergence of the Fletcher-Reeves method with inexact line-search. IMA J. Numer. Anal. 5, 121–124 (1985)
Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38, 401–416 (2007). doi:10.1007/s10589-007-9055-7
Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 47, 143–156 (2008). doi:10.1007/s11075-007-9152-9
Barzilai, J., Borwein, J.M.: Two point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
Birgin, E.G., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43, 117–128 (2001)
Buckley, A.: Extending the relationship between the conjugate gradient and BFGS algorithms. Math. Program. 15, 343–348 (1978)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. Ser. A 91, 201–213 (2002)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Hager, W.W., Zhang, H.C.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49(6), 409–439 (1952)
Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69, 129–137 (1991)
Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer, Berlin (2006)
Perry, A.: A modified conjugate gradient algorithm. Oper. Res. Tech. Notes 26(6), 1073–1078 (1978)
Powell, M.J.D.: A new algorithm for unconstrained optimization. In: Rosen, J.B., Mangasarian, O.L., Ritter, K. (eds.) Nonlinear Programming, pp. 31–66. Academic Press, New York (1970)
Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26–33 (1997)
Shanno, D.F.: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3, 244–256 (1978)
Shanno, D.F., Phua, K.H.: Algorithm 500, minimization of unconstrained multivariate functions. ACM Trans. Math. Softw. 2, 87–94 (1976)
Touati-Ahmed, D., Storey, C.: Efficient hybrid conjugate gradient techniques. J. Optim. Theory Appl. 64(2), 379–397 (1990)
Wei, Z.X., Li, G.Y., Qi, L.Q.: New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems. Appl. Math. Comput. 179, 407–430 (2006)
Yu, G.H., Zhao, Y.L., Wei, Z.X.: A descent nonlinear conjugate gradient method for large-scale unconstrained optimization. Appl. Math. Comput. 187, 636–643 (2007)
Additional information
This research is supported by the Natural Science Foundation of China grant NSFC-60874034.
Cite this article
Liu, D., Xu, G.: Applying Powell's symmetrical technique to conjugate gradient methods. Comput. Optim. Appl. 49, 319–334 (2011). https://doi.org/10.1007/s10589-009-9302-1