
Objective-derivative-free methods for constrained optimization

Published in: Mathematical Programming

Abstract

We propose feasible descent methods for constrained minimization that do not make explicit use of the derivative of the objective function. The methods iteratively sample the objective function value along a finite set of feasible search arcs and decrease the sampling stepsize if an improved objective function value is not sampled. The search arcs are obtained by projecting search direction rays onto the feasible set and the search directions are chosen such that a subset approximately generates the cone of first-order feasible variations at the current iterate. We show that these methods have desirable convergence properties under certain regularity assumptions on the constraints. In the case of linear constraints, the projections are redundant and the regularity assumptions hold automatically. Numerical experience with the methods in the linearly constrained case is reported.
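The scheme described in the abstract can be sketched for the simplest linearly constrained setting, a box-constrained problem, where projection onto the feasible set is just componentwise clipping. The sketch below is an illustrative reconstruction, not the authors' algorithm: it samples the objective along the positive and negative coordinate directions (which generate the cone of feasible variations for a box), projects each trial point onto the feasible set, and halves the sampling stepsize whenever no sufficiently improved value is found. All names, the sufficient-decrease constant, and the contraction factor are assumptions made for illustration.

```python
import numpy as np

def df_feasible_descent(f, x0, lo, hi, step=1.0, tol=1e-6, max_iter=1000):
    """Illustrative derivative-free feasible descent for lo <= x <= hi.

    Samples f along the +/- coordinate directions, projects each trial
    point onto the box by clipping, and shrinks the stepsize when no
    trial point achieves a sufficient decrease.
    """
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    fx = f(x)
    n = x.size
    for _ in range(max_iter):
        if step <= tol:
            break
        improved = False
        for i in range(n):
            for s in (+1.0, -1.0):
                d = np.zeros(n)
                d[i] = s
                # Project the trial point onto the feasible box.
                y = np.clip(x + step * d, lo, hi)
                fy = f(y)
                # Accept only a sufficient decrease; the forcing term
                # o(step) guards against accepting spurious improvements.
                if fy < fx - 1e-4 * step**2:
                    x, fx, improved = y, fy, True
        if not improved:
            # No improved value was sampled: contract the stepsize.
            step *= 0.5
    return x, fx
```

For example, minimizing f(x) = ||x||^2 over the box [-1, 1]^2 from an interior starting point drives both the objective value and the stepsize toward zero, mirroring the convergence behaviour the paper establishes under its regularity assumptions.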



Additional information

Received: November 12, 1999 / Accepted: April 6, 2001 / Published online: October 26, 2001



Cite this article

Lucidi, S., Sciandrone, M. & Tseng, P. Objective-derivative-free methods for constrained optimization. Math. Program. 92, 37–59 (2002). https://doi.org/10.1007/s101070100266

