An analysis of reduced Hessian methods for constrained optimization

Mathematical Programming 49, 285–323 (1990)

Abstract

We study the convergence properties of reduced Hessian successive quadratic programming for equality constrained optimization. The method uses a backtracking line search and updates an approximation to the reduced Hessian of the Lagrangian by means of the BFGS formula. Two merit functions are considered for the line search: the ℓ1 function and Fletcher's exact penalty function. We give conditions under which local and superlinear convergence is obtained, and also prove a global convergence result. The analysis allows the initial reduced Hessian approximation to be any positive definite matrix, and does not assume that the iterates converge or that the matrices are bounded. The effects of a second order correction step, of a watchdog procedure, and of the choice of null space basis are considered. This work can be seen as an extension to reduced Hessian methods of the well-known results of Powell (1976) for unconstrained optimization.
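For concreteness, the following is a minimal sketch (in Python with NumPy) of the kind of iteration analyzed here: a reduced Hessian SQP step that maintains a BFGS approximation B to the reduced Hessian of the Lagrangian and backtracks on the ℓ1 merit function. The function names, tolerances, the SVD-based orthonormal null space basis, and the toy problem are all illustrative assumptions, not the authors' algorithm; the second order correction and watchdog safeguards studied in the paper are omitted.

```python
# Minimal reduced Hessian SQP sketch for: min f(x) subject to c(x) = 0.
# Illustrative only; names, tolerances, and the toy problem are assumptions.
import numpy as np

def null_space_basis(A):
    """Orthonormal basis Z for the null space of the constraint Jacobian A."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-12))
    return Vt[rank:].T                      # columns of Z span null(A)

def l1_merit(f, c, x, mu):
    """The l1 merit function: f(x) + mu * ||c(x)||_1."""
    return f(x) + mu * np.sum(np.abs(c(x)))

def reduced_sqp_step(x, B, f, grad_f, c, jac_c, mu=10.0):
    """One iteration: step computation, l1 backtracking, reduced BFGS update."""
    A, g = jac_c(x), grad_f(x)
    Z = null_space_basis(A)
    # Normal component: least-squares move toward the constraints, A p = -c(x).
    p = -np.linalg.pinv(A) @ c(x)
    # Tangential component from the reduced quadratic model: B p_z = -Z^T g.
    p += Z @ np.linalg.solve(B, -Z.T @ g)
    # Backtracking line search on the l1 merit function; D estimates its
    # directional derivative along p (g^T p - mu * ||c||_1, since A p = -c).
    phi0 = l1_merit(f, c, x, mu)
    D = g @ p - mu * np.sum(np.abs(c(x)))
    alpha = 1.0
    while l1_merit(f, c, x + alpha * p, mu) > phi0 + 1e-4 * alpha * D and alpha > 1e-10:
        alpha *= 0.5
    x_new = x + alpha * p
    # BFGS update of the reduced Hessian approximation; skipping the update
    # when s^T y <= 0 keeps B positive definite, as the analysis requires.
    s_vec = Z.T @ (x_new - x)
    y_vec = Z.T @ (grad_f(x_new) - g)       # one common choice of curvature pair
    if s_vec @ y_vec > 1e-12:
        Bs = B @ s_vec
        B = B - np.outer(Bs, Bs) / (s_vec @ Bs) + np.outer(y_vec, y_vec) / (s_vec @ y_vec)
    return x_new, B

if __name__ == "__main__":
    # Toy problem: minimize x0^2 + x1^2 subject to x0 + x1 = 1.
    f = lambda x: x @ x
    grad_f = lambda x: 2.0 * x
    c = lambda x: np.array([x[0] + x[1] - 1.0])
    jac_c = lambda x: np.array([[1.0, 1.0]])
    x, B = np.array([2.0, 0.0]), np.eye(1)  # any positive definite B is allowed
    for _ in range(20):
        x, B = reduced_sqp_step(x, B, f, grad_f, c, jac_c)
    print(x)                                # approaches the solution (0.5, 0.5)
```

Note that, in keeping with the analysis, the initial matrix B is an arbitrary positive definite matrix and the update is skipped whenever the curvature condition fails, so positive definiteness is preserved throughout.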


References

  • D.P. Bertsekas, “Variable metric methods for constrained optimization using differentiable exact penalty functions,” Proceedings of the Eighteenth Allerton Conference on Communication, Control and Computing (University of Illinois Press, Champaign, IL, 1980) pp. 584–593.

  • P.T. Boggs and J.W. Tolle, “A family of descent functions for constrained optimization,” SIAM Journal on Numerical Analysis 21 (1984) 1146–1161.

  • P.T. Boggs and J.W. Tolle, “Extensions to the theory and efficient utilization of a merit function for nonlinear programming,” Technical Report 85-5, Curriculum in Operations Research and Systems Analysis, University of North Carolina (Chapel Hill, NC, 1985).

  • R.H. Byrd, “An example of irregular convergence in some constrained optimization methods that use the projected Hessian,” Mathematical Programming 32 (1985) 232–237.

  • R.H. Byrd, “On the convergence of constrained optimization methods with accurate Hessian information on a subspace,” SIAM Journal on Numerical Analysis 27 (1990) 141–153.

  • R.H. Byrd and J. Nocedal, “A tool for the analysis of quasi-Newton methods with application to unconstrained minimization,” SIAM Journal on Numerical Analysis 26 (1989) 727–739.

  • R.H. Byrd, J. Nocedal and Y. Yuan, “Global convergence of a class of quasi-Newton methods on convex problems,” SIAM Journal on Numerical Analysis 24 (1987) 1171–1190.

  • R.H. Byrd and R.B. Schnabel, “Continuity of the null space basis and constrained optimization,” Mathematical Programming 35 (1986) 32–41.

  • R.M. Chamberlain, C. Lemarechal, H.C. Pedersen and M.J.D. Powell, “The watchdog technique for forcing convergence in algorithms for constrained optimization,” Mathematical Programming Study 16 (1982) 1–17.

  • T.F. Coleman and A.R. Conn, “Nonlinear programming via an exact penalty function: global analysis,” Mathematical Programming 24 (1982) 137–161.

  • T.F. Coleman and A.R. Conn, “On the local convergence of a quasi-Newton method for the nonlinear programming problem,” SIAM Journal on Numerical Analysis 21 (1984) 755–769.

  • T.F. Coleman and D. Sorensen, “A note on the computation of an orthonormal basis for the null space of a matrix,” Mathematical Programming 29 (1984) 234–242.

  • J.E. Dennis, Jr. and J.J. Moré, “Quasi-Newton methods, motivation and theory,” SIAM Review 19 (1977) 46–89.

  • P. Fenyes, “Partitioned quasi-Newton methods for nonlinear equality constrained optimization,” Ph.D. Dissertation, Department of Computer Science, Cornell University (Ithaca, NY, 1987).

  • R. Fletcher, “A class of methods for nonlinear programming with termination and convergence properties,” in: J. Abadie, ed., Integer and Nonlinear Programming (North-Holland, Amsterdam, 1970).

  • R. Fletcher, “An exact penalty function for nonlinear programming with inequalities,” Mathematical Programming 5 (1973) 129–150.

  • R. Fletcher, “Second order corrections for nondifferentiable optimization,” in: G.A. Watson, ed., Numerical Analysis, Lecture Notes in Mathematics No. 912 (Springer, Berlin, 1982).

  • R. Fletcher, “A first derivative method for nonlinear programming based on successive ℓ1 LP,” Numerical Analysis Report NA/114, Department of Mathematics and Computer Science, University of Dundee (Dundee, 1988).

  • R. Fontecilla, “Local convergence of secant methods for nonlinear constrained optimization,” SIAM Journal on Numerical Analysis 25 (1988) 692–712.

  • D. Gabay, “Reduced quasi-Newton methods with feasibility improvement for nonlinearly constrained optimization,” Mathematical Programming Study 16 (1982) 18–44.

  • J.C. Gilbert, “Maintaining the positive definiteness of the matrices in reduced Hessian methods for equality constrained optimization,” Technical Report WP-87-123, International Institute for Applied Systems Analysis (Laxenburg, Austria, 1987).

  • P.E. Gill, W. Murray, M.A. Saunders, G.W. Stewart and M.H. Wright, “Properties of a representation of a basis for the null space,” Mathematical Programming 33 (1985) 172–186.

  • S.T. Glad, “Properties of updating methods for the multipliers in augmented Lagrangians,” Journal of Optimization Theory and Applications 28 (1979) 135–156.

  • S.P. Han, “Superlinearly convergent variable metric algorithms for general nonlinear programming problems,” Mathematical Programming 11 (1976) 263–282.

  • S.P. Han, “A globally convergent method for nonlinear programming,” Journal of Optimization Theory and Applications 22 (1977) 297–309.

  • D.Q. Mayne and E. Polak, “A superlinearly convergent algorithm for constrained optimization problems,” Mathematical Programming Study 16 (1982) 45–61.

  • W. Murray and M.H. Wright, “Projected Lagrangian methods based on the trajectories of penalty and barrier functions,” Systems Optimization Laboratory Report 78-23, Stanford University (Stanford, CA, 1978).

  • J. Nocedal and M.L. Overton, “Projected Hessian updating algorithms for nonlinearly constrained optimization,” SIAM Journal on Numerical Analysis 22 (1985) 821–850.

  • M.J.D. Powell, “Some global convergence properties of a variable metric algorithm for minimization without exact line searches,” in: R.W. Cottle and C.E. Lemke, eds., Nonlinear Programming, SIAM-AMS Proceedings, Vol. IX (Society for Industrial and Applied Mathematics, Philadelphia, PA, 1976).

  • M.J.D. Powell, “The convergence of variable metric methods for nonlinearly constrained optimization calculations,” in: O. Mangasarian, R. Meyer and S. Robinson, eds., Nonlinear Programming 3 (Academic Press, New York and London, 1978) pp. 27–63.

  • M.J.D. Powell, “Methods for nonlinear constraints in optimization calculations,” in: Proceedings of the 1986 IMA/SIAM Meeting on “The State of the Art in Numerical Analysis” (Clarendon Press, Oxford, 1987).

  • M.J.D. Powell and Y. Yuan, “A recursive quadratic programming algorithm that uses differentiable exact penalty functions,” Mathematical Programming 35 (1986) 265–278.

  • R.A. Tapia, “Diagonalized multiplier methods and quasi-Newton methods for constrained optimization,” Journal of Optimization Theory and Applications 22 (1977) 135–194.

  • Y. Yuan, “An only 2-step Q-superlinear convergence example for some algorithms that use reduced Hessian approximations,” Mathematical Programming 32 (1985) 224–231.



Additional information

This author was supported, in part, by National Science Foundation grant CCR-8702403, Air Force Office of Scientific Research grant AFOSR-85-0251, and Army Research Office contract DAAL03-88-K-0086.

This author was supported by the Applied Mathematical Sciences subprogram of the Office of Energy Research, U.S. Department of Energy, under contracts W-31-109-Eng-38 and DE FG02-87ER25047, and by National Science Foundation Grant No. DCR-86-02071.


Cite this article

Byrd, R.H., Nocedal, J. An analysis of reduced Hessian methods for constrained optimization. Mathematical Programming 49, 285–323 (1990). https://doi.org/10.1007/BF01588794

