
A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees


Abstract

A line search algorithm for minimizing nonconvex and/or nonsmooth objective functions is presented. The algorithm is a hybrid between a standard Broyden–Fletcher–Goldfarb–Shanno (BFGS) method and an adaptive gradient sampling (GS) method. The BFGS strategy is employed because it typically yields fast convergence to the vicinity of a stationary point, while the adaptive GS strategy ensures that convergence to such a point is ultimately achieved. Under suitable assumptions, it is proved that the algorithm converges globally with probability one. The algorithm has been implemented in C++ and the results of numerical experiments illustrate the efficacy of the proposed approach.
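The recipe the abstract describes pairs two standard ingredients: a BFGS update of an inverse Hessian approximation, which drives fast local progress, and a gradient sampling step, which builds a descent direction from the minimum-norm element of the convex hull of gradients sampled near the current iterate. The following is a minimal illustrative sketch of those two ingredients only, not the authors' implementation: the test function `grad_f`, the projected-gradient solve of the small min-norm subproblem, and all helper names are assumptions introduced here for illustration.

```cpp
// Toy sketch (not the paper's code) of two ingredients of a BFGS / gradient-sampling hybrid:
// (i) a gradient-sampling direction from the min-norm element of the convex hull of
//     gradients sampled near the iterate, and (ii) the standard BFGS inverse Hessian update.
#include <armadillo>

// Gradient of a simple nonsmooth test function f(x) = max_i |x_i| + 0.5*||x||^2,
// valid wherever the max is attained by a unique index (almost surely for random points).
arma::vec grad_f(const arma::vec& x) {
  arma::uword j = arma::index_max(arma::abs(x));
  arma::vec g = x;                       // gradient of the smooth quadratic part
  g(j) += (x(j) >= 0 ? 1.0 : -1.0);      // gradient of the max-abs term
  return g;
}

// Euclidean projection onto the probability simplex {lambda >= 0, sum(lambda) = 1}.
arma::vec project_simplex(const arma::vec& v) {
  arma::vec u = arma::sort(v, "descend");
  double cumsum = 0.0, theta = 0.0;
  for (arma::uword i = 0; i < u.n_elem; ++i) {
    cumsum += u(i);
    double t = (cumsum - 1.0) / static_cast<double>(i + 1);
    if (u(i) - t > 0) theta = t;
  }
  return arma::clamp(v - theta, 0.0, arma::datum::inf);
}

// Approximate min-norm element of conv{g_1,...,g_m}; columns of G are sampled gradients.
// Solved here by projected gradient on the simplex; a real method would use a QP solver.
arma::vec min_norm_convex_combination(const arma::mat& G, int iters = 500) {
  arma::mat Q = G.t() * G;                               // small m-by-m Gram matrix
  arma::vec lambda(G.n_cols, arma::fill::ones);
  lambda /= static_cast<double>(G.n_cols);
  double step = 1.0 / (2.0 * arma::norm(Q, 2) + 1e-12);  // safe step length
  for (int k = 0; k < iters; ++k)
    lambda = project_simplex(lambda - step * 2.0 * Q * lambda);
  return G * lambda;
}

// Gradient-sampling direction at x: sample m gradients within radius eps of x and
// negate the (approximate) min-norm element of their convex hull.
arma::vec gs_direction(const arma::vec& x, double eps, int m) {
  arma::mat G(x.n_elem, m + 1);
  G.col(0) = grad_f(x);                                  // always include the gradient at x
  for (int i = 1; i <= m; ++i)
    G.col(i) = grad_f(x + eps * arma::randn<arma::vec>(x.n_elem));
  return -min_norm_convex_combination(G);
}

// Standard BFGS update of the inverse Hessian approximation H with s = x_new - x_old
// and y = grad_new - grad_old, applied only when the curvature condition s'y > 0 holds.
void bfgs_update(arma::mat& H, const arma::vec& s, const arma::vec& y) {
  double sy = arma::dot(s, y);
  if (sy <= 1e-12) return;                               // skip the update otherwise
  arma::mat I = arma::eye(H.n_rows, H.n_cols);
  arma::mat V = I - (s * y.t()) / sy;
  H = V * H * V.t() + (s * s.t()) / sy;
}
```

Roughly speaking, a hybrid of this kind tries the quasi-Newton direction -H*grad_f(x) with a line search at each iterate, falls back on gs_direction when progress stalls near a nonsmooth point, and shrinks the sampling radius eps as the norm of the sampled min-norm element becomes small; the paper's actual safeguards, sampling rules, and convergence analysis are what distinguish the proposed method from this sketch.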



Author information

Correspondence to Xiaocun Que.

Additional information

F. E. Curtis and X. Que were supported in part by National Science Foundation Grant DMS-1016291.


Cite this article

Curtis, F.E., Que, X. A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees. Math. Prog. Comp. 7, 399–428 (2015). https://doi.org/10.1007/s12532-015-0086-2

