Abstract
In this work, we propose a new conjugate gradient method that modifies Perry's method and ensures sufficient descent independently of the accuracy of the line search. An important property of the proposed method is that it achieves high-order accuracy in approximating the second-order curvature information of the objective function by utilizing a new modified secant condition. Moreover, we establish that the proposed method is globally convergent for general functions provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate that the proposed method is, in general, superior to classical conjugate gradient methods in terms of efficiency and robustness.
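The paper's specific modified update formula and secant condition are not reproduced on this page. As a rough illustration of the framework the abstract describes, the sketch below implements a generic nonlinear conjugate gradient iteration with Perry's original choice of the update parameter, a weak Wolfe line search, and a steepest-descent restart safeguard; function names, parameter values, and the safeguard are illustrative assumptions, not the authors' method.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/expansion search for a step satisfying the weak Wolfe conditions."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    f0 = f(x)
    g0 = grad(x).dot(d)          # directional derivative at x (negative for descent d)
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0:
            hi = alpha            # sufficient-decrease (Armijo) condition fails: shrink
        elif grad(x + alpha * d).dot(d) < c2 * g0:
            lo = alpha            # curvature condition fails: expand
        else:
            return alpha          # both Wolfe conditions hold
        alpha = 0.5 * (lo + hi) if hi < np.inf else 2.0 * lo
    return alpha

def perry_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with Perry's beta: beta_k = g_{k+1}^T (y_k - s_k) / (d_k^T y_k)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g          # step and gradient difference
        beta = g_new.dot(y - s) / d.dot(y)   # d.dot(y) > 0 under the Wolfe conditions
        d = -g_new + beta * d
        if d.dot(g_new) >= 0:                # safeguard: restart if descent is lost
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic f(x) = x^T A x / 2 - b^T x the iteration drives the gradient A x - b toward zero; the descent safeguard plays the role that the paper's sufficient-descent property guarantees by construction.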




Livieris, I.E., Pintelas, P. A modified Perry conjugate gradient method and its global convergence. Optim Lett 9, 999–1015 (2015). https://doi.org/10.1007/s11590-014-0820-0