
A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization

Published in: Journal of Optimization Theory and Applications

Abstract

Nonlinear conjugate gradient methods are among the simplest and most widely used methods for smooth optimization. Because of their low memory requirements and ease of implementation, they are particularly attractive for large-scale smooth problems: they use only the current gradient and the previous search direction to determine the next search direction, and they require no numerical linear algebra. However, nonlinear conjugate gradient methods have not been widely employed for solving nonsmooth optimization problems. In this paper, a modified nonlinear conjugate gradient method, which is both globally convergent and numerically efficient, is proposed for solving large-scale nonsmooth convex problems. The new method generates a search direction that satisfies the sufficient descent property and belongs to a trust region. Under suitable conditions, the global convergence of the proposed algorithm is established for nonsmooth convex problems. Its numerical efficiency is tested and compared with some existing methods on large-scale nonsmooth academic test problems. The numerical results show that the new algorithm performs very well in solving large-scale nonsmooth problems.
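To illustrate the matrix-free iteration the abstract describes, the following is a minimal sketch of a *generic* nonlinear conjugate gradient loop (a PRP+ variant with Armijo backtracking), not the modified method proposed in the paper. The function names and the quadratic test problem are illustrative choices; the paper's actual algorithm targets nonsmooth convex objectives and uses a different, sufficient-descent direction.

```python
import numpy as np

def cg_minimize(f, grad, x0, max_iter=500, tol=1e-8):
    """Generic nonlinear CG sketch with the PRP+ coefficient (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                        # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:            # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking line search
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ coefficient: needs only current/previous gradients, no matrix algebra
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d     # next direction: new gradient plus previous direction
        x, g = x_new, g_new
    return x

# usage: minimize the smooth test function f(x) = ||x - 1||^2
f = lambda x: np.sum((x - 1.0) ** 2)
grad = lambda x: 2.0 * (x - 1.0)
x_star = cg_minimize(f, grad, np.zeros(1000))
```

The update `d = -g_new + beta * d` is what keeps memory usage at a few vectors regardless of dimension, which is why the abstract singles out conjugate gradient methods for large-scale problems.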



Acknowledgements

The authors would like to thank the reviewers and editor for their valuable comments, which greatly improved the paper. The authors are also very thankful to Dr. Abba Hagos Weldeslassie Fissuh for his assistance in editing this manuscript. This work was supported by the National Natural Science Foundation of China (Grant No. 11771003).

Author information

Corresponding author

Correspondence to Tsegay Giday Woldu.

Additional information

Communicated by Enrique Zuazua.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Woldu, T.G., Zhang, H., Zhang, X. et al. A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization. J Optim Theory Appl 185, 223–238 (2020). https://doi.org/10.1007/s10957-020-01636-7
