
The Projection Technique for Two Open Problems of Unconstrained Optimization Problems

Published in: Journal of Optimization Theory and Applications

Abstract

Two open problems concerning nonconvex functions under the weak Wolfe–Powell line search remain in unconstrained optimization. The first is the global convergence of the Polak–Ribière–Polyak (PRP) conjugate gradient algorithm, and the second is the global convergence of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) quasi-Newton method. Many scholars have shown by counterexample that both methods can fail to converge for nonconvex functions, even under an exact line search. In particular, two circle counterexamples that exhibit the nonconvergence of the PRP algorithm for nonconvex functions under the exact line search inspired us to define a new technique that jumps away from the circle point and attains global convergence. A new PRP algorithm is therefore designed by the following steps: (i) given the current point, a parabolic surface is constructed; (ii) an assistant point is defined based on the current point; (iii) the assistant point is projected onto the surface to generate the next point; (iv) the resulting algorithm is globally convergent for nonconvex functions under the weak Wolfe–Powell line search. A similar technique is applied to the quasi-Newton method to obtain a new quasi-Newton algorithm and to establish its global convergence. Numerical results show that the proposed algorithms are more competitive than other similar algorithms. The well-known hydrologic engineering application, the parameter estimation problem of the nonlinear Muskingum model, is also solved by the proposed algorithms.
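For orientation, the baseline method that the paper modifies can be sketched as follows. This is a minimal illustration of the classical PRP conjugate gradient iteration with a weak Wolfe–Powell line search, not the authors' projection-based variant (the parabolic surface and assistant-point projection are not reproduced here); the function names and tolerances are our own choices for the sketch.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step length a satisfying the weak
    Wolfe-Powell conditions:
      f(x + a d) <= f(x) + c1 * a * g^T d   (sufficient decrease)
      grad(x + a d)^T d >= c2 * g^T d       (curvature)
    """
    lo, hi, a = 0.0, np.inf, 1.0
    fx, gd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + a * d) > fx + c1 * a * gd:
            hi = a                       # step too long: shrink
            a = 0.5 * (lo + hi)
        elif grad(x + a * d) @ d < c2 * gd:
            lo = a                       # step too short: grow
            a = 2.0 * a if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return a
    return a

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Classical Polak-Ribiere-Polyak conjugate gradient method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        a = weak_wolfe(f, grad, x, d)
        x_new = x + a * d
        g_new = grad(x_new)
        # PRP parameter: beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Demo on a simple convex quadratic, where PRP is known to converge.
x_min = prp_cg(lambda v: v @ v, lambda v: 2.0 * v, [3.0, -2.0])
```

The abstract's point is that this unmodified iteration has no global convergence guarantee for nonconvex objectives under this line search; the paper's projection of an assistant point onto a parabolic surface is what restores it.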




Acknowledgements

We would like to thank the two referees and the editor for their many valuable suggestions and comments, which greatly improved this paper. This work is supported by the National Natural Science Foundation of China (Grant No. 11661009), the Guangxi Science Fund for Distinguished Young Scholars (No. 2015GXNSFGA139001), and the Guangxi Natural Science Key Fund (No. 2017GXNSFDA198046).

Author information

Corresponding author

Correspondence to Gonglin Yuan.

Additional information

Communicated by Johannes O. Royset.



Cite this article

Yuan, G., Wang, X. & Sheng, Z. The Projection Technique for Two Open Problems of Unconstrained Optimization Problems. J Optim Theory Appl 186, 590–619 (2020). https://doi.org/10.1007/s10957-020-01710-0

