
Jacobian-Free Implicit Inner-Iteration Preconditioner for Nonlinear Least Squares Problems

Journal of Scientific Computing

Abstract

Nonlinear least squares (NLS) problems arise in many applications. Common solvers require computing and storing the corresponding Jacobian matrix explicitly, which is too expensive for large problems. Recently, some Jacobian-free (or matrix-free) methods have been proposed, but most of them are not truly Jacobian-free, since the full or partial Jacobian still has to be computed at some iteration steps. In this paper, we propose an effective, truly Jacobian-free method, especially for large NLS problems, realized by a novel combination of automatic differentiation for the products \(J(\mathbf{x})\mathbf{v}\) and \(J(\mathbf{x})^T\mathbf{v}\) with implicit iterative preconditioning. Together, these yield a new and effective three-level iterative approach. At the outer level, the dogleg/trust-region method is employed to solve the NLS problem. At the middle level, an iterative linear least squares (LLS) solver, CGLS or BA-GMRES, is adopted to solve the LLS problem generated at each step of the dogleg method. To accelerate the convergence of the iterative LLS solver, we propose an implicit inner-iteration preconditioner based on the weighted Jacobi method. Compared with existing Jacobian-free methods, the proposed three-level method need not compute any part of the Jacobian matrix explicitly at any iteration step. Furthermore, it does not rely on the sparsity or structure of the Jacobian, gradient, or Hessian matrix; in other words, it also works well for dense Jacobian matrices. Numerical experiments show the superiority of the proposed method.
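To illustrate the middle-level building block, the following is a minimal sketch (not the authors' implementation) of CGLS driven purely by operator callbacks: the solver never forms \(J\), touching it only through `matvec` (\(\mathbf{v} \mapsto J\mathbf{v}\)) and `rmatvec` (\(\mathbf{v} \mapsto J^T\mathbf{v}\)), which in the paper's setting would be supplied by forward- and reverse-mode automatic differentiation of the residual function. The callback names and the unpreconditioned form are assumptions for illustration; the paper additionally wraps such a solver with a weighted-Jacobi inner-iteration preconditioner.

```python
def dot(u, v):
    """Euclidean inner product of two vectors given as Python lists."""
    return sum(ui * vi for ui, vi in zip(u, v))

def axpy(a, x, y):
    """Return a*x + y componentwise."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def cgls(matvec, rmatvec, b, n, tol=1e-12, maxiter=100):
    """Minimize ||A x - b||_2 using only the products A*v and A^T*v.

    matvec  : callable, v -> A v      (m-vector from n-vector)
    rmatvec : callable, u -> A^T u    (n-vector from m-vector)
    """
    x = [0.0] * n
    r = list(b)              # residual r = b - A x, with x = 0 initially
    s = rmatvec(r)           # s = A^T r, gradient of the LLS objective
    p = list(s)
    gamma = dot(s, s)
    for _ in range(maxiter):
        q = matvec(p)
        alpha = gamma / dot(q, q)
        x = axpy(alpha, p, x)
        r = axpy(-alpha, q, r)
        s = rmatvec(r)
        gamma_new = dot(s, s)
        if gamma_new < tol:
            break
        p = axpy(gamma_new / gamma, p, s)   # p = s + beta * p
        gamma = gamma_new
    return x

# Demo: least squares for a consistent 3x2 system; the solver sees only
# the two callbacks, never the matrix A itself.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
mv = lambda v: [row[0] * v[0] + row[1] * v[1] for row in A]
rmv = lambda u: [sum(A[i][j] * u[i] for i in range(3)) for j in range(2)]
x = cgls(mv, rmv, b, 2)      # x is approximately [1.0, 2.0]
```

In exact arithmetic CGLS is mathematically equivalent to CG applied to the normal equations \(A^TA\mathbf{x} = A^T\mathbf{b}\), but it avoids forming \(A^TA\), which is what makes the operator-only interface possible.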



Acknowledgments

We would like to thank Dr. Keiichi Morikuni and Mr. Kota Sugihara for useful discussions and Prof. Thomas Coleman for valuable comments.

Corresponding author

Correspondence to Wei Xu.

Additional information

This work was supported by the Fundamental Research Funds for the Central Universities in China and the MOU Grant of the National Institute of Informatics, Japan, and the Grant-in-Aid for Scientific Research (C) of the Ministry of Education, Culture, Sports, Science and Technology, Japan.

About this article

Cite this article

Xu, W., Zheng, N. & Hayami, K. Jacobian-Free Implicit Inner-Iteration Preconditioner for Nonlinear Least Squares Problems. J Sci Comput 68, 1055–1081 (2016). https://doi.org/10.1007/s10915-016-0167-z
