Abstract
Many globally convergent conjugate gradient methods have been proposed for unconstrained optimization, such as the MPRP algorithm of Zhang et al. (IMA J. Numer. Anal. 26(4):629–640, 2006). Unfortunately, almost all of these methods require the gradient Lipschitz continuity condition. To the best of our knowledge, the question of how conjugate gradient methods can handle problems whose gradients are not Lipschitz continuous remains largely unexplored. For such problems, this paper proposes Algorithm 1 and Algorithm 2 based on the MPRP algorithm. The proposed algorithms have the following characteristics: (i) Algorithm 1 retains the sufficient descent property of the MPRP algorithm, which holds independently of the line search technique; (ii) for nonconvex functions whose gradients are not Lipschitz continuous, the global convergence of Algorithm 1 is established by combining a trust-region property with the weak Wolfe-Powell line search; (iii) Algorithm 2 further improves Algorithm 1 so that global convergence is obtained independently of the line search technique; (iv) numerical experiments show that the proposed algorithms are competitive with similar algorithms.
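To make the ingredients concrete, the following is a minimal Python sketch of the classical three-term MPRP scheme of Zhang et al. combined with a bracketing weak Wolfe-Powell line search. It illustrates the sufficient descent identity g_k^T d_k = -||g_k||^2, which holds by construction for any step size; it is not the paper's Algorithm 1 or 2, and the function names, parameters, and the bisection strategy are our own illustrative choices.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, delta=1e-4, sigma=0.9):
    """Weak Wolfe-Powell line search by bracketing and bisection.

    Seeks alpha > 0 with
      f(x + alpha*d) <= f(x) + delta*alpha*g^T d   (sufficient decrease)
      grad(x + alpha*d)^T d >= sigma*g^T d          (curvature)
    where 0 < delta < sigma < 1.
    """
    fx, gd = f(x), grad(x) @ d
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(60):
        if f(x + alpha * d) > fx + delta * alpha * gd:
            hi = alpha                      # sufficient decrease fails: step too long
        elif grad(x + alpha * d) @ d < sigma * gd:
            lo = alpha                      # curvature fails: step too short
        else:
            return alpha
        alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
    return alpha

def mprp_direction(g_new, g_old, d_old):
    """Three-term MPRP direction d = -g + beta*d_old - theta*y.

    The two extra terms cancel in g_new^T d, giving the sufficient
    descent identity g_new^T d = -||g_new||^2 exactly.
    """
    y = g_new - g_old
    denom = g_old @ g_old
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y

def mprp(f, grad, x0, tol=1e-6, max_iter=1000):
    """Run the three-term MPRP iteration until ||g|| <= tol."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = weak_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        d = mprp_direction(g_new, g, d)
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic the sketch drives the gradient norm below the tolerance in a handful of iterations; the point of the three-term form is that the descent identity needs no assumption on the line search or on gradient Lipschitz continuity.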
References
Al-Baali, M.: Descent property and global convergence of the Fletcher-Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)
Dai, Y.: Analysis of conjugate gradient methods. Ph.D. thesis, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences (1997)
Dai, Y.: Convergence properties of the BFGS algorithm. SIAM J. Optim. 13(3), 693–701 (2002)
Dai, Y., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Fletcher, R.: Practical Methods of Optimization, 2nd edn. Wiley, New York (1987)
Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)
Gilbert, J., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Goldstein, A.A.: On steepest descent. J. Soc. Indus. Appl. Math. Ser. A Control 3(1), 147–151 (1965)
Hestenes, M., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)
Hager, W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hager, W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Liu, J., Lu, Z., Xu, J., Wu, S., Tu, Z.: An efficient projection-based algorithm without Lipschitz continuity for large-scale nonlinear pseudo-monotone equations. J. Comput. Appl. Math. 113822 (2021). https://doi.org/10.1016/j.cam.2021.113822
Levenberg, K.: A method for the solution of certain non-linear problems in least squares. Quart. Appl. Math. 2(2), 164–168 (1944)
Li, X., Wang, S., Jin, Z., et al.: A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models. Math. Probl. Eng. 2018 (2018)
Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69(1), 129–137 (1991)
Polyak, B.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. ESAIM: Mathematical Modelling and Numerical Analysis - Modélisation Mathématique et Analyse Numérique 3(R1), 35–43 (1969)
Powell, M.: Convergence properties of algorithms for nonlinear optimization. SIAM Rev. 28(4), 487–500 (1986)
Powell, M.: Nonconvex Minimization Calculations and the Conjugate Gradient Method. Springer, Berlin (1984)
Powell, M.J.D.: Convergence properties of a class of minimization algorithms. In: Nonlinear Programming 2, pp. 1–27. Academic Press (1975)
Sheng, Z., Ouyang, A., Liu, L., et al.: A novel parameter estimation method for Muskingum model using new Newton-type trust region algorithm. Math. Probl. Eng. 2014 (2014)
Sheng, Z., Yuan, G.: An effective adaptive trust region algorithm for nonsmooth minimization. Comput. Optim. Appl. 71(1), 251–271 (2018)
Sheng, Z., Yuan, G., Cui, Z., et al.: An adaptive trust region algorithm for large-residual nonsmooth least squares problems. J. Ind. Manag. Optim. 14(2), 707 (2018)
Sheng, Z., Yuan, G., Cui, Z.: A new adaptive trust region algorithm for optimization problems. Acta Math. Sci. 38(2), 479–496 (2018)
Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)
Wolfe, P.: Convergence conditions for ascent methods. II: Some corrections. SIAM Rev. 13(2), 185–188 (1971)
Wei, Z., Yao, S., Liu, L.: The convergence properties of some new conjugate gradient methods. Appl. Math. Comput. 183(2), 1341–1350 (2006)
Yuan, G., Li, T., Hu, W.: A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems. Appl. Numer. Math. 147, 129–141 (2020)
Yuan, G., Lu, X.: A modified PRP conjugate gradient method. Ann. Oper. Res. 166(1), 73–90 (2009)
Yuan, G., Lu, J., Wang, Z.: The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems. Appl. Numer. Math. 152, 1–11 (2020)
Yuan, G., Lu, J., Wang, Z.: The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems. Soft Comput. 25(8), 5867–5879 (2021)
Yuan, G., Lu, X., Wei, Z.: A conjugate gradient method with descent direction for unconstrained optimization. J. Comput. Appl. Math. 233(2), 519–530 (2009)
Yuan, G., Lu, S., Wei, Z.: A new trust-region method with line search for solving symmetric nonlinear equations. Int. J. Comput. Math. 88(10), 2109–2123 (2011)
Yuan, G., Lu, X., Wei, Z.: BFGS trust-region method for symmetric nonlinear equations. J. Comput. Appl. Math. 230(1), 44–58 (2009)
Yuan, G.: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems. Optim. Lett. 3, 11–21 (2009)
Yuan, G., Meng, Z., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168(1), 129–152 (2016)
Yuan, G., Zhang, M., Zhou, Y.: Adaptive scaling damped BFGS method without gradient Lipschitz continuity. Appl. Math. Lett. 124, 107634 (2022). https://doi.org/10.1016/j.aml.2021.107634
Yuan, G., Wei, Z., Li, G.: A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs. J. Comput. Appl. Math. 255, 86–96 (2014)
Yuan, G., Wei, Z., Yang, Y.: The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions. J. Comput. Appl. Math. 362, 262–275 (2019)
Yuan, G., Zhang, M.: A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations. J. Comput. Appl. Math. 286, 186–195 (2015)
Yuan, Y.: Analysis on the conjugate gradient method. Optim. Methods Softw. 2(1), 19–29 (1993)
Zoutendijk, G.: Nonlinear programming, computational methods. In: Integer and Nonlinear Programming, pp. 37–86 (1970)
Zhang, L., Zhou, W., Li, D.: A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26(4), 629–640 (2006)
Funding
This work was supported by the National Natural Science Foundation of China (Grant No. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi institutions of higher education (Grant No. [2019]52), the Guangxi Natural Science Key Fund (No. 2017GXNSFDA198046), the Special Funds for Local Science and Technology Development Guided by the Central Government (No. ZY20198003), and Special Foundation for Guangxi Ba Gui Scholars.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare no competing interests.
Cite this article
Yuan, G., Yang, H. & Zhang, M. Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions. Numer Algor 91, 145–160 (2022). https://doi.org/10.1007/s11075-022-01257-3
Keywords
- Conjugate gradient
- Nonconvex functions
- Sufficient descent property
- Global convergence
- Gradient Lipschitz continuity condition