Abstract
Nonlinear conjugate gradient methods are among the simplest and most effective methods for solving smooth optimization problems. Because of their simplicity and low memory requirements, they are particularly attractive for large-scale smooth problems. Conjugate gradient methods use only the current gradient and the previous search direction to determine the next search direction, and they require no numerical linear algebra. However, nonlinear conjugate gradient methods have not been widely employed for solving nonsmooth optimization problems. In this paper, a modified nonlinear conjugate gradient method, which achieves both global convergence and numerical efficiency, is proposed to solve large-scale nonsmooth convex problems. The search direction of the new method satisfies the sufficient descent property and lies in a trust region. Under suitable conditions, the global convergence of the proposed algorithm is established for nonsmooth convex problems. The numerical performance of the proposed algorithm is tested and compared with several existing methods on a set of large-scale nonsmooth academic test problems. The numerical results show that the new algorithm performs very well on large-scale nonsmooth problems.
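For illustration, the basic conjugate gradient recursion described above, in which the next search direction is built from the current gradient and a multiple of the previous direction, can be sketched as follows. This is a generic PRP+ variant with an Armijo backtracking line search on a smooth test problem, not the modified method proposed in the paper (which operates on nonsmooth problems and constrains the direction to a trust region); all names in the snippet are illustrative.

```python
import numpy as np

def prp_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimal PRP+ nonlinear conjugate gradient sketch with a
    backtracking Armijo line search (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # backtracking Armijo line search
        alpha, rho, c = 1.0, 0.5, 1e-4
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP parameter: beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2,
        # truncated at zero (the "PRP+" modification)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d               # next search direction
        if g_new.dot(d) >= 0:               # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the smooth convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
sol = prp_conjugate_gradient(lambda x: 0.5 * x @ A @ x - b @ x,
                             lambda x: A @ x - b, np.zeros(2))
```

Note that only the vectors `g` and `d` are stored between iterations, which is the low memory requirement the abstract refers to.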
Acknowledgements
The authors would like to thank the reviewers and the editor for their valuable comments, which greatly improved this paper. The authors also thank Dr. Abba Hagos Weldeslassie Fissuh for his assistance in editing this manuscript. This work is supported by the National Natural Science Foundation of China (Grant No. 11771003).
Communicated by Enrique Zuazua.
Cite this article
Woldu, T.G., Zhang, H., Zhang, X. et al. A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization. J Optim Theory Appl 185, 223–238 (2020). https://doi.org/10.1007/s10957-020-01636-7
Keywords
- Conjugate gradient method
- Moreau–Yosida regularization
- Nonsmooth large-scale problems
- Global convergence