Abstract
The subgradient method is both a heavily employed and widely studied algorithm for non-differentiable optimization. Nevertheless, there are some basic properties of subgradient optimization that, while “well known” to specialists, seem to be rather poorly known in the larger optimization community. This note concerns two such properties, both applicable to subgradient optimization using the divergent series steplength rule. The first involves convergence of the iterative process, and the second deals with the construction of primal estimates when subgradient optimization is applied to maximize the Lagrangian dual of a linear program. The two topics are related in that convergence of the iterates is required to prove correctness of the primal construction scheme.
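The divergent series steplength rule mentioned above uses steps $\alpha_k \to 0$ with $\sum_k \alpha_k = \infty$. As a minimal illustration (not taken from the paper; the objective, starting point, and step choice $\alpha_k = 1/(k+1)$ are assumptions for the sketch), the rule can be applied to a simple non-differentiable convex function:

```python
import numpy as np

# Illustrative sketch: subgradient method with the divergent series
# steplength rule alpha_k = 1/(k+1), applied to the non-differentiable
# convex function f(x) = ||x||_1, whose minimizer is x = 0.

def f(x):
    return np.abs(x).sum()

def subgrad(x):
    # sign(x) is a subgradient of the l1 norm (sign(0) = 0 is a valid choice)
    return np.sign(x)

x = np.array([3.0, -2.0])   # arbitrary starting point (assumption)
f_best = f(x)
for k in range(5000):
    alpha = 1.0 / (k + 1)   # alpha_k -> 0 and sum_k alpha_k diverges
    x = x - alpha * subgrad(x)
    f_best = min(f_best, f(x))

# The iterates approach the minimizer, though f need not decrease monotonically;
# f_best tracks the best value seen so far.
```

The method is not a descent method, which is why implementations typically record the best objective value found, as above.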
Additional information
Dedicated to B.T. Polyak on the occasion of his 70th birthday.
Cite this article
Anstreicher, K.M., Wolsey, L.A. Two “well-known” properties of subgradient optimization. Math. Program. 120, 213–220 (2009). https://doi.org/10.1007/s10107-007-0148-y