On the local convergence of a derivative-free algorithm for least-squares minimization

Computational Optimization and Applications

Abstract

In Zhang et al. (SIAM J. Optim., accepted 2010), we developed a class of derivative-free algorithms, called DFLS, for least-squares minimization. Global convergence of the algorithm, as well as its excellent numerical performance within a limited computational budget, was established and discussed in the same paper. Here we establish the local quadratic convergence of the algorithm for zero-residual problems. The asymptotic convergence performance of the algorithm is tested for both zero- and nonzero-residual problems. Our numerical experiments indicate that the algorithm is also very promising for obtaining high-accuracy solutions compared with software packages that do not exploit the special structure of the least-squares problem, or that use finite differences to approximate the gradients.
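To see why zero-residual problems are special (standard Gauss-Newton background, not a derivation quoted from the paper): for f(x) = (1/2)||r(x)||^2 the Hessian is J(x)^T J(x) + sum_i r_i(x) grad^2 r_i(x), and when the residuals vanish at the solution the second term disappears, so a model built from r alone captures the full second-order behavior asymptotically. The sketch below illustrates this structure-exploiting idea in a minimal derivative-free form. It is our own illustration, not the DFLS implementation: the toy test problem, the one-sided linear interpolation model (DFLS maintains quadratic interpolation models inside a trust region), and all names and parameters (interp_jacobian, df_gauss_newton, h, mu) are assumptions made for the sketch.

```python
import numpy as np

def residuals(x):
    # Zero-residual toy problem (Rosenbrock written in least-squares form):
    # r(x*) = 0 at x* = (1, 1), so the minimizer of f = 0.5*||r||^2 has
    # zero residual.
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

def interp_jacobian(r, x, h=1e-6):
    """Estimate the Jacobian of r at x from function values only, by linear
    interpolation of each residual on the sample set {x, x + h*e_j}.
    This one-sided linear model is a minimal stand-in for DFLS's richer
    quadratic interpolation models."""
    rx = np.asarray(r(x))
    J = np.empty((rx.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (np.asarray(r(xp)) - rx) / h
    return rx, J

def df_gauss_newton(r, x0, tol=1e-10, max_iter=50, mu=1e-12):
    """Derivative-free Gauss-Newton loop: model f = 0.5*||r||^2 through a
    model of r itself.  On zero-residual problems J^T J captures the full
    curvature asymptotically, which is the mechanism behind fast local
    convergence."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        rx, J = interp_jacobian(r, x)
        g = J.T @ rx  # gradient of the structured model of f
        if np.linalg.norm(g) < tol:
            break
        # Regularized normal equations: a Levenberg-Marquardt safeguard
        # standing in for DFLS's trust-region machinery.
        step = np.linalg.solve(J.T @ J + mu * np.eye(x.size), -g)
        x = x + step
        print(f"iter {k:2d}: ||r(x)|| = {np.linalg.norm(r(x)):.3e}")
    return x

print("solution:", df_gauss_newton(residuals, [-1.2, 1.0]))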

References

  1. Conn, A.R., Gould, N.I.M., Toint, Ph.L.: Trust-Region Methods. MPS-SIAM Series on Optimization. SIAM, Philadelphia (2000)

  2. Conn, A.R., Scheinberg, K., Vicente, L.N.: Global convergence of general derivative-free trust-region algorithms to first and second order critical points. SIAM J. Optim. 20, 387–415 (2009)

  3. Conn, A.R., Scheinberg, K., Vicente, L.N.: Geometry of interpolation sets in derivative free optimization. Math. Program. 111, 141–172 (2008)

  4. Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to Derivative-Free Optimization. MPS-SIAM Series on Optimization. SIAM, Philadelphia (2009)

  5. Conn, A.R., Scheinberg, K., Vicente, L.N.: Geometry of sample sets in derivative free optimization: Polynomial regression and underdetermined interpolation. IMA J. Numer. Anal. 28, 721–748 (2008)

  6. Dan, H., Yamashita, N., Fukushima, M.: Convergence properties of the inexact Levenberg-Marquardt method under local error bound. Optim. Methods Softw. 17, 605–626 (2002)

  7. Fan, J.: Convergence properties of a self-adaptive Levenberg-Marquardt algorithm under local error bound condition. Comput. Optim. Appl. 34, 47–62 (2006)

  8. Fan, J., Yuan, Y.: On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption. Computing 74, 23–39 (2005)

  9. Hager, W.W., Zhang, H.: Self-adaptive inexact proximal point methods. Comput. Optim. Appl. 39, 161–181 (2008)

  10. Hager, W.W., Zhang, H.: Asymptotic convergence analysis of a new class of proximal point methods. SIAM J. Control Optim. 46, 1683–1704 (2007)

  11. Hock, W., Schittkowski, K.: Test Examples for Nonlinear Programming Codes. Lecture Notes in Economics and Mathematical Systems, vol. 187. Springer, Berlin (1981)

  12. Moré, J.J.: The Levenberg-Marquardt algorithm, implementation and theory. In: Watson, G.A. (ed.) Numerical Analysis. Lecture Notes in Mathematics, vol. 630. Springer, Berlin (1977)

  13. Powell, M.J.D.: Developments of NEWUOA for unconstrained minimization without derivatives. IMA J. Numer. Anal. 28, 649–664 (2008)

  14. Powell, M.J.D.: The NEWUOA software for unconstrained optimization without derivatives. Report, DAMTP, University of Cambridge (2004)

  15. Powell, M.J.D.: Least Frobenius norm updating of quadratic models that satisfy interpolation conditions. Math. Program., Ser. B 100, 183–215 (2004)

  16. Powell, M.J.D.: On trust region methods for unconstrained minimization without derivatives. Math. Program. 97, 605–623 (2003)

  17. Tseng, P.: Error bounds and superlinear convergence analysis of some Newton-type methods in optimization. In: Di Pillo, G., Giannessi, F. (eds.) Nonlinear Optimization and Related Topics, pp. 445–462. Kluwer Academic, Dordrecht (2000)

  18. Yamashita, N., Fukushima, M.: The proximal point algorithm with genuine superlinear convergence for the monotone complementarity problem. SIAM J. Optim. 11, 364–379 (2000)

  19. Yamashita, N., Fukushima, M.: On the rate of convergence of the Levenberg-Marquardt method. Computing 15, 237–249 (2001)

  20. Zhang, H., Conn, A.R., Scheinberg, K.: A derivative-free algorithm for least-squares minimization. SIAM J. Optim. (2010, accepted)

  21. Zhang, H., Conn, A.R., Scheinberg, K.: A derivative-free algorithm for least-squares minimization with bound constraints (2010, in preparation)

Author information

Correspondence to Hongchao Zhang.

Additional information

This material is based upon work supported by the National Science Foundation under Grant 1016204.

About this article

Cite this article

Zhang, H., Conn, A.R. On the local convergence of a derivative-free algorithm for least-squares minimization. Comput Optim Appl 51, 481–507 (2012). https://doi.org/10.1007/s10589-010-9367-x
