Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei

  • Original Paper
  • Numerical Algorithms

Abstract

In Dai and Wen (Numer. Algor. 69, 337–341, 2015), some improvements were presented to the proofs of Theorem 2 and Theorem 4 in Andrei (Numer. Algor. 47, 143–156, 2008). However, because incorrect inequalities were used, the proof of Theorem 2.1 in Dai and Wen (2015) is flawed. Moreover, the assumption 0 < c1θk < 1 in Theorem 2.1 of Dai and Wen (2015) cannot be removed. In this paper, the necessary corrections are made. Finally, an alternative proof of Theorem 3.1 in Dai and Wen (2015) is given. Throughout, we use the same notation and equation numbers as in Dai and Wen (2015) and Andrei (2008).
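To make the setting concrete: the hybrid conjugate gradient method of Andrei [2] computes the parameter βk as a convex combination of the Hestenes–Stiefel [14] and Dai–Yuan [15] formulas, weighted by θk, and the condition 0 < c1θk < 1 above refers to this weight. The sketch below illustrates only this general structure, under simplifying assumptions of our own: a fixed weight theta (Andrei chooses θk adaptively at each iteration), a backtracking Armijo line search in place of the Wolfe-type conditions assumed by the convergence theory, and a steepest-descent restart safeguard. It is not the exact scheme analyzed in the paper.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=500):
    """Sketch of a hybrid CG iteration: beta is a convex combination
    of the Hestenes-Stiefel and Dai-Yuan parameters."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (illustrative only; the
        # convergence analysis in the paper assumes Wolfe conditions).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        # Hestenes-Stiefel [14] and Dai-Yuan [15] parameters.
        beta_hs = g_new.dot(y) / denom if denom != 0.0 else 0.0
        beta_dy = g_new.dot(g_new) / denom if denom != 0.0 else 0.0
        # Convex combination with a fixed weight theta in [0, 1];
        # Andrei's method picks theta_k adaptively at every iteration.
        beta = (1.0 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:
            d = -g_new  # restart so d stays a descent direction
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
x_star = hybrid_cg(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                   np.array([1.0, 1.0]))
print(x_star)  # approaches the minimizer [0, 0]
```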

References

  1. Dai, Z.F., Wen, F.H.: Comments on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei. Numer. Algor. 69, 337–341 (2015)

  2. Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algor. 47, 143–156 (2008)

  3. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  4. Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23(4), 2150–2168 (2013)

  5. Dai, Y., Kou, C.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  6. Zhang, L., Zhou, W., Li, D.: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 104(4), 561–572 (2006)

  7. Al-Baali, M.: Descent property and global convergence of the Fletcher-Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)

  8. Zheng, X., Liu, H., Lu, A.: Sufficient descent conjugate gradient methods for large-scale optimization problems. Int. J. Comput. Math. 88(16), 3436–3447 (2011)

  9. Zheng, X., Shi, J.: A modified sufficient descent Polak-Ribière-Polyak type conjugate gradient method for unconstrained optimization problems. Algorithms 11(9), 133 (2018)

  10. Dong, X., Liu, H., He, Y.: A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 165(1), 225–241 (2015)

  11. Dong, X., Han, D., Dai, Z., Li, X., Zhu, J.: An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 179(3), 944–961 (2018)

  12. Dai, Z.F., Wen, F.H.: Two nonparametric approaches to mean absolute deviation portfolio selection model. J. Ind. Manag. Optim. (2019). https://doi.org/10.3934/jimo.2019054

  13. Al-Baali, M., Narushima, Y., Yabe, H.: A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization. Comput. Optim. Appl. 60(1), 89–110 (2015)

  14. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)

  15. Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

Acknowledgments

We are grateful to the anonymous referees for their valuable comments and suggestions that helped to improve the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (11601012), First-Class Disciplines Foundation of Ningxia (NXYLXK2017B09), and Shaanxi Province Natural Science Fund of China (2019JM252).

Author information

Corresponding author

Correspondence to Xiuyun Zheng.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zheng, X., Dong, X., Shi, J. et al. Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei. Numer Algor 84, 603–608 (2020). https://doi.org/10.1007/s11075-019-00771-1
