
A descent extension of a modified Polak–Ribière–Polyak method with application in image restoration problem

  • Original Paper
  • Published in: Optimization Letters

Abstract

A one-parameter extension of the modified Polak–Ribière–Polyak method proposed by Sun and Liu is developed based on the Dai–Liao approach. Two adaptive choices for the parameter of the method are suggested: one is obtained by carrying out an eigenvalue analysis, and the other is determined by minimizing the distance between the search direction of the method and the direction of the three-term conjugate gradient method proposed by Sun and Liu. It is shown that the method satisfies the sufficient descent condition when its parameter is chosen appropriately. A global convergence analysis is conducted. Finally, the practical merits of the method are investigated by numerical experiments on a set of CUTEr test functions as well as on the well-known image restoration problem. The results demonstrate the numerical efficiency of the method.
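For orientation, the display below recalls the standard conjugate gradient framework and the generic Dai–Liao family that the abstract builds on (see Dai and Liao [24] and the survey of Hager and Zhang [36]); it is textbook background in standard notation, not the paper's specific modified Polak–Ribière–Polyak extension or its adaptive parameter choices. Given iterates \(x_{k+1} = x_k + \alpha_k d_k\), with the step length \(\alpha_k\) computed by a (strong) Wolfe line search, the search directions are

\(d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,\)

where \(g_k = \nabla f(x_k)\). The classical Polak–Ribière–Polyak parameter [48, 49] and the one-parameter Dai–Liao family [24] are

\(\beta_k^{\rm PRP} = \dfrac{g_{k+1}^{\top} y_k}{\|g_k\|^2}, \qquad \beta_k^{\rm DL} = \dfrac{g_{k+1}^{\top} y_k - t\, g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \quad t > 0,\)

with \(s_k = x_{k+1} - x_k\) and \(y_k = g_{k+1} - g_k\). The sufficient descent condition mentioned in the abstract requires a constant \(c > 0\) such that \(g_k^{\top} d_k \le -c\,\|g_k\|^2\) for all \(k\).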


References

  1. Abubakar, A.B., Kumam, P., Awwal, A.M.: Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery. Results Appl. Math. 4, 100069 (2019)

  2. Abubakar, A.B., Kumam, P., Malik, M., Chaipunya, P., Ibrahim, A.H.: A hybrid FR–DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection. AIMS Math. 6(6), 6506–6527 (2021)

  3. Aminifard, Z., Babaie-Kafaki, S.: A modified descent Polak–Ribière–Polyak conjugate gradient method with global convergence property for nonconvex functions. Calcolo 56(2), 16 (2019)

  4. Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38(3), 401–416 (2007)

  5. Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)

  6. Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 47(2), 143–156 (2008)

  7. Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213(2), 361–369 (2009)

  8. Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)

  9. Andrei, N.: A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization. Optimization 60(12), 1457–1471 (2011)

  10. Andrei, N.: Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization. J. Optim. Theory Appl. 159(1), 159–182 (2013)

  11. Awwal, A.M., Kumam, P., Abubakar, A.B.: A modified conjugate gradient method for monotone nonlinear equations with convex constraints. Appl. Numer. Math. 145, 507–520 (2019)

  12. Babaie-Kafaki, S.: An eigenvalue study on the sufficient descent property of a modified Polak–Ribière–Polyak conjugate gradient method. Bull. Iran. Math. Soc. 40(1), 235–242 (2014)

  13. Babaie-Kafaki, S.: A modified three-term conjugate gradient method with sufficient descent property. Appl. Math. J. Chinese Univ. 30, 263–272 (2015)

  14. Babaie-Kafaki, S., Ghanbari, R.: A descent extension of the Polak–Ribière–Polyak conjugate gradient method. Comput. Math. Appl. 68(12), 2005–2011 (2014)

  15. Babaie-Kafaki, S., Ghanbari, R.: A descent family of Dai–Liao conjugate gradient methods. Optim. Methods Softw. 29(3), 583–591 (2014)

  16. Babaie-Kafaki, S., Ghanbari, R.: A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bull. Belg. Math. Soc. Simon Stevin 21(3), 465–477 (2014)

  17. Bojari, S., Eslahchi, M.R.: Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization. Numer. Algorithms 83(11), 901–933 (2020)

  18. Bovik, A.L.: Handbook of Image and Video Processing, 2nd edn. Academic Press, Burlington (2005)

  19. Cao, J., Wu, J.: A conjugate gradient algorithm and its applications in image restoration. Appl. Numer. Math. 152, 243–252 (2020)

  20. Chan, R.H., Ho, C.W., Nikolova, M.: Salt-and-pepper noise removal by median-type noise detectors and detail-preserving regularization. IEEE Trans. Image Process. 14(10), 1479–1485 (2005)

  21. Chen, Y., Cao, M., Yang, Y.: A new accelerated conjugate gradient method for large-scale unconstrained optimization. J. Inequal. Appl. 2019(1), 1–13 (2019)

  22. Cheng, W.: A two-term PRP-based descent method. Numer. Funct. Anal. Optim. 28(11–12), 1217–1230 (2007)

  23. Dai, Y.H., Han, J.Y., Liu, G.H., Sun, D.F., Yin, H.X., Yuan, Y.X.: Convergence properties of nonlinear conjugate gradient methods. SIAM J. Optim. 10(2), 348–358 (1999)

  24. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)

  25. Dai, Z.: Two modified Polak–Ribière–Polyak-type nonlinear conjugate methods with sufficient descent property. Numer. Funct. Anal. Optim. 31(7–9), 892–906 (2010)

  26. de Leeuw den Bouter, M.L., van Gijzen, M.B., Remis, R.F.: Conjugate gradient variants for \(\ell_p\)-regularized image reconstruction in low-field MRI. SN Appl. Sci. 1, 1736 (2019)

  27. Dehghani, R., Mahdavi-Amiri, N.: Scaled nonlinear conjugate gradient methods for nonlinear least-squares problems. Numer. Algorithms 82, 1–20 (2019)

  28. Deng, S., Wan, Z., Chen, X.: An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems. J. Optim. Theory Appl. 157(3), 820–842 (2013)

  29. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)

  30. Dong, X.L., Han, D.R., Ghanbari, R., Li, X.L., Dai, Z.F.: Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination. Optimization 66(5), 759–776 (2017)

  31. Esmaeili, H., Shabani, S., Kimiaei, M.: A new generalized shrinkage conjugate gradient method for sparse recovery. Calcolo 56(1), 1–38 (2019)

  32. Exl, L., Fischbacher, J., Oezelt, H., Gusenbauer, M., Schrefl, T.: Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization. Comput. Phys. Commun. 235, 179–186 (2019)

  33. Feng, H., Li, T.: An accelerated conjugate gradient algorithm for solving nonlinear monotone equations and image restoration problems. Math. Probl. Eng. 2020(1), 1–13 (2020)

  34. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)

  35. Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)

  36. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)

  37. Heravi, A.R., Hodtani, G.A.: A new correntropy-based conjugate gradient backpropagation algorithm for improving training in neural networks. IEEE Trans. Neural Netw. Learn. Syst. 29(12), 6252–6263 (2018)

  38. Ibrahim, A.H., Kumam, P., Abubakar, A.B., Abubakar, J., Muhammad, A.B.: Least-square-based three-term conjugate gradient projection method for \(\ell_1\)-norm problems with application to compressed sensing. Mathematics 8(4), 602 (2020)

  39. Ibrahim, A.H., Kumam, P., Kumam, W.: A family of derivative-free conjugate gradient methods for constrained nonlinear equations and image restoration. IEEE Access 8, 162714–162729 (2020)

  40. Li, W., Liu, Y., Yang, J., Wu, W.: A new conjugate gradient method with smoothing \(L_{1/2}\) regularization based on a modified secant equation for training neural networks. Neural Process. Lett. 48, 955–978 (2018)

  41. Li, X., Zhang, W., Dong, X.: A class of modified FR conjugate gradient method and applications to non-negative matrix factorization. Comput. Math. Appl. 73, 270–276 (2017)

  42. Lin, J., Jiang, C.: An improved conjugate gradient parametric detection based on space-time scan. Signal Process. 169, 107412 (2020)

  43. Lin, N., Chen, Y., Lu, L.: Mineral potential mapping using a conjugate gradient logistic regression model. Nat. Resour. Res. 29, 173–188 (2020)

  44. Liu, Y., Zhang, L., Lian, Z.: Conjugate gradient algorithm in the four-dimensional variational data assimilation system in GRAPES. J. Meteorol. Res. 32, 974–984 (2018)

  45. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)

  46. Ou, Y., Lin, H.: A class of accelerated conjugate-gradient-like methods based on a modified secant equation. J. Ind. Manag. Optim. 16(3), 1503 (2020)

  47. Perry, A.: A modified conjugate gradient algorithm. Oper. Res. 26(6), 1073–1078 (1978)

  48. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle 3(16), 35–43 (1969)

  49. Polyak, B.T.: The conjugate gradient method in extreme problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)

  50. Powell, M.J.D.: Restart procedures for the conjugate gradient method. Math. Program. 12(2), 241–254 (1977)

  51. Stanimirović, P.S., Ivanov, B., Djordjević, S., Brajević, I.: New hybrid conjugate gradient and Broyden–Fletcher–Goldfarb–Shanno conjugate gradient methods. J. Optim. Theory Appl. 178(3), 860–884 (2018)

  52. Sun, M., Liu, J.: Three modified Polak–Ribière–Polyak conjugate gradient methods with sufficient descent property. J. Inequal. Appl. 2015, 125 (2015)

  53. Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)

  54. Wan, Z., Yang, Z.L., Wang, Y.L.: New spectral PRP conjugate gradient method for unconstrained optimization. Appl. Math. Lett. 24(1), 16–22 (2011)

  55. Wang, X.Y., Li, S.J., Kou, X.P.: A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints. Calcolo 53, 133–145 (2016)

  56. Watkins, D.S.: Fundamentals of Matrix Computations. Wiley, New York (2002)

  57. Yu, G., Guan, L., Li, G.: Global convergence of modified Polak–Ribière–Polyak conjugate gradient methods with sufficient descent property. J. Ind. Manag. Optim. 4(3), 565–579 (2008)

  58. Yu, G., Huang, J., Zhou, Y.: A descent spectral conjugate gradient method for impulse noise removal. Appl. Math. Lett. 23(5), 555–560 (2010)

  59. Yuan, G., Li, T., Hu, W.: A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration. J. Inequal. Appl. 2019(1), 247 (2019)

  60. Yuan, G., Li, T., Hu, W.: A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems. Appl. Numer. Math. 147, 129–141 (2020)

  61. Yuan, G., Lu, J., Wang, Z.: The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems. Appl. Numer. Math. 152, 1–11 (2020)

  62. Yuan, G., Meng, Z., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168(1), 129–152 (2016)

  63. Yuan, G.L.: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems. Optim. Lett. 3(1), 11–21 (2009)

  64. Zhang, L., Zhou, W., Li, D.H.: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26(4), 629–640 (2006)

  65. Zhang, L., Zhou, W., Li, D.H.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)


Data availability

The datasets generated and/or analysed during the current study are available from the corresponding author on reasonable request.

Acknowledgements

The authors are grateful to Professor Michael Navon for providing the strong Wolfe line search code. They also thank the anonymous reviewers for their valuable comments, which helped to improve the quality of this work.

Author information

Correspondence to Saman Babaie-Kafaki.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Babaie-Kafaki, S., Mirhoseini, N. & Aminifard, Z. A descent extension of a modified Polak–Ribière–Polyak method with application in image restoration problem. Optim Lett 17, 351–367 (2023). https://doi.org/10.1007/s11590-022-01878-6

