Abstract
In an optimization problem with equality constraints, the optimal value function divides the state space into two parts. At a point where the objective function is less than the optimal value, a good iteration must increase the value of the objective function. Thus, a good iteration must balance increasing or decreasing the objective function against decreasing a constraint violation function. This implies that, at a point where the constraint violation function is large, we should construct noninferior solutions relative to points in a local search region. By definition, an accessory function is a linear combination of the objective function and a constraint violation function. We show that one way to construct an acceptable iteration, at a point where the constraint violation function is large, is to minimize an accessory function. We develop a two-phase method. In Phase I, some constraints may not be approximately satisfied, or the current point may not be close to the solution; iterations are generated by minimizing an accessory function. Once all the constraints are approximately satisfied, initial values of the Lagrange multipliers are defined. A test with a merit function determines whether the current point and the Lagrange multipliers are both close to the optimal solution. If not, Phase I is continued. Otherwise, Phase II is activated and the Newton method is used to compute the optimal solution, achieving fast convergence.
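The accessory-function idea can be illustrated on a toy problem. In the sketch below, the quadratic violation function h(x)^2, the fixed weight c, and plain gradient descent are illustrative assumptions for a minimal Phase-I-style iteration, not the paper's specific construction.

```python
import numpy as np

# Toy problem (illustrative, not taken from the paper):
#   minimize f(x) = x1^2 + x2^2  subject to  h(x) = x1 + x2 - 1 = 0.
# The exact constrained minimizer is x* = (0.5, 0.5).
def f(x):
    return x[0]**2 + x[1]**2

def h(x):
    return x[0] + x[1] - 1.0

def accessory(x, c):
    # Linear combination of the objective and a constraint violation
    # function; here the violation function is h(x)^2 (one common choice).
    return f(x) + c * h(x)**2

def grad_accessory(x, c):
    return np.array([2.0 * x[0], 2.0 * x[1]]) + 2.0 * c * h(x) * np.array([1.0, 1.0])

def phase1(x0, c=10.0, step=0.02, tol=1e-8, max_iter=20000):
    """Drive down the accessory function by gradient descent until its
    gradient is small; the iterate approximately satisfies the constraint."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_accessory(x, c)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

x = phase1([3.0, -2.0])
# For this quadratic example the accessory minimizer is (10/21, 10/21),
# close to the true solution (0.5, 0.5) with a small constraint violation.
```

A fixed weight c leaves a penalty-style bias of order 1/c; in the paper's two-phase method, such a point would then seed the Lagrange multipliers and a Newton Phase II to recover the exact solution quickly.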
Additional information
Communicated by N.G. Medhin.
The author is grateful to Professors Bingsheng He and Mark Wu for support and help in this research, and to Professor F. Giannessi for many helpful comments.
Cite this article
Goh, B.S. Approximate Greatest Descent Methods for Optimization with Equality Constraints. J Optim Theory Appl 148, 505–527 (2011). https://doi.org/10.1007/s10957-010-9765-3