A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems

Journal of Global Optimization (2015)

Abstract

In the framework of large-scale optimization problems, the standard BFGS method is impractical due to its memory requirements. The so-called limited-memory BFGS (L-BFGS) method is an adaptation of the BFGS method to large-scale settings. However, the standard BFGS method, and therefore the standard L-BFGS method, uses only gradient information of the objective function and neglects function values. In this paper, we propose a new regularized L-BFGS method for solving large-scale unconstrained optimization problems in which the available function and gradient values are exploited more fully to approximate the curvature of the objective function. The proposed method utilizes a class of modified quasi-Newton equations in order to achieve higher-order accuracy in approximating the second-order curvature of the objective function. Under some standard assumptions, we establish the global convergence of the new method. In order to provide an efficient method for finding global minima of a continuously differentiable function, we also propose a hybrid algorithm that combines a genetic algorithm (GA) with the new regularized L-BFGS method. This combination steers the iterates toward stationary points of the objective function that are more likely to be global minima. Numerical results demonstrate the efficiency and robustness of the proposed regularized L-BFGS method and its GA-hybridized variant in practice.
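To make the construction concrete, here is a minimal sketch of an L-BFGS step built on a Zhang–Deng–Chen-type modified secant condition, in which the curvature pair is corrected by a scalar computed from function values as well as gradients. The correction formula follows that line of work; the shift mu used below as a stand-in for the paper's regularization, and all function names, are illustrative assumptions rather than the authors' exact algorithm.

```python
import numpy as np

def modified_pair(s, f_old, f_new, g_old, g_new):
    # Zhang-Deng-Chen-type modified secant pair: fold function values
    # into the curvature estimate via
    #   theta = 2*(f_old - f_new) + (g_old + g_new)^T s,
    #   y_bar = y + (theta / s^T s) * s,
    # so that the secant condition B s = y_bar matches the Hessian
    # to higher order than the standard choice y = g_new - g_old.
    y = g_new - g_old
    theta = 2.0 * (f_old - f_new) + (g_old + g_new) @ s
    return s, y + (theta / (s @ s)) * s

def lbfgs_direction(g, pairs, mu=0.0):
    # Standard two-loop recursion over the stored (s, y_bar) pairs,
    # oldest first in `pairs`.  mu > 0 damps the initial scaling as a
    # simple stand-in for the paper's regularization (hypothetical;
    # the authors' precise rule may differ).
    q = g.copy()
    alphas = []
    for s, y in reversed(pairs):                    # newest pair first
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if pairs:
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y + mu)                 # regularized H0 = gamma*I
    for (s, y), a in zip(pairs, reversed(alphas)):  # oldest pair first
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return -q                                       # quasi-Newton direction
```

In a full method, each accepted step would produce a pair via modified_pair, which is stored only when its curvature s^T y_bar is safely positive (discarding the oldest pair once the memory limit is reached), and the direction would be combined with a Wolfe line search.

The hybrid scheme is memetic in flavor: the GA explores the search space globally while the quasi-Newton method polishes promising individuals. The sketch below uses SciPy's L-BFGS-B routine as a stand-in for the regularized L-BFGS above; the population size, blend crossover, mutation scale, and refine-the-incumbent policy are all illustrative choices, not the authors' settings.

```python
import numpy as np
from scipy.optimize import minimize

def hybrid_ga_lbfgs(f, dim, lo=-5.0, hi=5.0, pop=30, gens=50, seed=0):
    rng = np.random.default_rng(seed)
    P = rng.uniform(lo, hi, size=(pop, dim))           # random population
    for _ in range(gens):
        P = P[np.argsort([f(x) for x in P])]           # rank by fitness
        P[0] = minimize(f, P[0], method="L-BFGS-B").x  # polish the incumbent
        half = pop // 2                                # elitist selection
        i, j = rng.integers(0, half, (2, pop - half))  # random parent indices
        w = rng.random((pop - half, 1))
        kids = w * P[i] + (1.0 - w) * P[j]             # blend crossover
        kids += 0.1 * rng.standard_normal(kids.shape)  # Gaussian mutation
        P = np.vstack([P[:half], kids])
    return min(P, key=f)                               # best point found
```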



Acknowledgments

The authors would like to thank the research council of K.N. Toosi University of Technology and the SCOPE research center. They would also like to thank Prof. M. Al-Baali for his helpful comments on this work.

Author information


Corresponding author

Correspondence to M. Reza Peyghami.


About this article


Cite this article

Tarzanagh, D.A., Peyghami, M.R. A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems. J Glob Optim 63, 709–728 (2015). https://doi.org/10.1007/s10898-015-0310-7

