Abstract
We study notions of nondegeneracy and several levels of increasing degeneracy from the perspective of the local behavior of a local solution of a nonlinear program when problem parameters are slightly perturbed. Ideal nondegeneracy at a local minimizer is taken to mean satisfaction of second-order sufficient conditions, linear independence and strict complementary slackness. Following a brief exploration of the relationship of these conditions with the classical definition of nondegeneracy in linear programming, we recall a number of optimality and regularity conditions used to attempt to resolve degeneracy and survey results of Fiacco, McCormick, Robinson, Kojima, Gauvin and Janin, Shapiro, Kyparisis and Liu. This overview may be viewed as a structured survey of sensitivity and stability results: the focus is on progressive levels of degeneracy. We note connections of nondegeneracy with the convergence of algorithms and observe the striking parallel between the effects of nondegeneracy and degeneracy on optimality conditions, stability analysis and algorithmic convergence behavior. Although our orientation here is primarily interpretive and noncritical, we conclude that more effort is needed to unify optimality, stability and convergence theory, and that more results are needed in all three areas for radically degenerate problems.
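For orientation, the three ingredients of ideal nondegeneracy named above can be written out as follows. This is a standard textbook formulation in notation of our own choosing (inequality constraints only, equality constraints omitted for brevity), not an excerpt from the paper:

\[
\begin{aligned}
&\text{Problem: } \min_x f(x)\ \text{ s.t. }\ g_i(x)\le 0,\ i=1,\dots,m,
  \qquad A(x^*)=\{\, i : g_i(x^*)=0 \,\}.\\
&\text{KKT point: } \nabla_x L(x^*,\lambda^*)=\nabla f(x^*)+\textstyle\sum_{i=1}^m \lambda_i^*\,\nabla g_i(x^*)=0,
  \qquad \lambda_i^*\ge 0,\quad \lambda_i^*\, g_i(x^*)=0.\\
&\text{Linear independence (LICQ): } \{\nabla g_i(x^*): i\in A(x^*)\}\ \text{is linearly independent}.\\
&\text{Strict complementary slackness: } \lambda_i^*>0\ \text{ for every } i\in A(x^*).\\
&\text{Second-order sufficiency: } z^\top \nabla^2_{xx} L(x^*,\lambda^*)\, z>0
  \ \text{ for all } z\neq 0 \text{ with } \nabla g_i(x^*)^\top z=0,\ i\in A(x^*).
\end{aligned}
\]

Under strict complementary slackness the critical cone reduces to the subspace used in the last line, which is why the second-order condition can be stated on that subspace here.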
References
W. Alt, Lipschitzian perturbation of infinite optimization problems, in: Mathematical Programming with Data Perturbation II, ed. A.V. Fiacco (Marcel Dekker, New York, 1983).
A. Auslender and R. Cominetti, First and second-order sensitivity analysis of nonlinear programs under directional constraint qualification conditions, Optimization 21 (1990) 351.
A.V. Fiacco and G.P. McCormick, Nonlinear Programming: Sequential Unconstrained Minimization Techniques (Wiley, New York, 1968).
A.V. Fiacco, Introduction to Sensitivity and Stability Analysis in Nonlinear Programming (Academic Press, New York, 1983).
A.V. Fiacco, Sensitivity analysis for nonlinear programming using penalty methods, Math. Progr. 10 (1976) 287.
A.V. Fiacco and Y. Ishizuka, Sensitivity and stability analysis for nonlinear programming, Ann. Oper. Res. 27 (1990) 215.
J. Gauvin and R. Janin, Directional behavior of optimal solutions in nonlinear mathematical programming, Math. Oper. Res. 13 (1988) 629.
N.H. Josephy, Newton's method for generalized equations, Technical Summary Report No. 1965, Mathematics Research Center, University of Wisconsin-Madison (1979).
N.H. Josephy, Quasi-Newton methods for generalized equations, Technical Summary Report No. 1966, Mathematics Research Center, University of Wisconsin-Madison (1979).
M. Kojima, Strongly stable stationary solutions in nonlinear programs, in: Analysis and Computation of Fixed Points, ed. S.M. Robinson (Academic Press, New York, 1980).
J. Kyparisis, On uniqueness of Kuhn-Tucker multipliers in nonlinear programming, Math. Progr. 32 (1985) 242.
J. Kyparisis, Sensitivity analysis framework for variational inequalities, Math. Progr. 38 (1987) 203.
J. Kyparisis, Sensitivity analysis for nonlinear programs and variational inequalities with nonunique multipliers, Math. Oper. Res. 15 (1990) 286.
J. Liu, Linear stability of generalized equations, Part I: basic theory, Technical paper T-548, Department of Operations Research, George Washington University, Washington, DC (1991).
J. Liu, Linear stability of generalized equations, Part II: applications to nonlinear programming, Technical paper T-549, Department of Operations Research, George Washington University, Washington, DC (1991).
J. Liu, Regular generalized equations, Technical paper T-551, Department of Operations Research, George Washington University, Washington, DC (1991).
J. Liu, Directional differentiability of solutions in nonlinear programs and variational inequalities with nonunique multipliers, Technical paper T-552, Department of Operations Research, George Washington University, Washington, DC (1991).
J. Liu, Solution Hölder continuity for nonlinear programs, Technical note, Department of Operations Research, George Washington University, Washington, DC (1991).
G.P. McCormick, Optimality criteria in nonlinear programming, in: SIAM-AMS Proc. 9 (SIAM, Philadelphia, 1975) p. 27.
M.J.D. Powell, Variable metric methods for constrained optimization, in: Mathematical Programming: The State of the Art, eds. A. Bachem, M. Grötschel and B. Korte (Springer, Berlin, 1982).
S.M. Robinson, Perturbed Kuhn-Tucker points and rates of convergence for a class of nonlinear programming algorithms, Math. Progr. 7 (1974) 1.
S.M. Robinson, Strongly regular generalized equations, Math. Oper. Res. 5 (1980) 43.
S.M. Robinson, Generalized equations and their solutions, Part II: applications to nonlinear programming, Math. Progr. Study 19 (1982) 200.
R.T. Rockafellar, Directional differentiability of the optimal value function in a nonlinear programming problem, Math. Progr. Study 21 (1984) 213.
E. Sachs, Rates of convergence for adaptive Newton methods, J. Optim. Theory Appl. 48 (1986) 175.
A. Shapiro, Second order sensitivity analysis and asymptotic theory of parameterized nonlinear programs, Math. Progr. 33 (1985) 280.
A. Shapiro, Sensitivity analysis of nonlinear programs and differentiability properties of metric projection, SIAM J. Control Optim. 26 (1988) 628.
C. Wang and F. Zhao, On the directional derivatives of optimal value functions in mathematical programming and its applications, Preprint, Institute of Operations Research, Qufu Normal University, Shandong, P.R. China (1991).
G.P. McCormick, Nonlinear Programming: Theory, Algorithms, and Applications (Wiley, New York, 1983).
Additional information
Research supported by National Science Foundation Grant ECS 90-00560 and by Office of Naval Research Grant N00014-89-J-1537.
Cite this article
Fiacco, A.V., Liu, J. Degeneracy in NLP and the development of results motivated by its presence. Ann Oper Res 46, 61–80 (1993). https://doi.org/10.1007/BF02096257