Convergence study on the proximal alternating direction method with larger step size

Abstract

The alternating direction method of multipliers (ADMM) is a popular method for solving separable convex programs with linear constraints, and its proximal version is an important variant. In the literature, Fortin and Glowinski proved that the step size for updating the Lagrange multiplier in the ADMM can be chosen in the open interval from zero to the golden ratio (1+√5)/2, and this result was subsequently shown to remain valid for the proximal ADMM. In this paper, we demonstrate that the dual step size can be larger than the golden ratio when the proximal regularization is positive definite; the feasible interval of the dual step size for the proximal ADMM can therefore be further enlarged. Moreover, we establish the exact relationship between the dual step size and the proximal parameter. We also prove global convergence and establish a worst-case convergence rate in the ergodic sense for this proximal scheme with the enlarged step size. Finally, we present numerical results to demonstrate the practical performance of the method.
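To fix notation for the result stated above, consider the separable model min { f(x) + g(y) : Ax + By = b } with closed convex f and g, and write β > 0 for the penalty parameter, γ > 0 for the dual step size, and D for the proximal regularization matrix (attached here to the y-subproblem for illustration). The following display is a generic template of the proximal ADMM, given only for orientation; the precise scheme, the placement of the proximal term, and the exact admissible range of γ are as specified in the paper.

\[
\begin{aligned}
x^{k+1} &= \operatorname*{arg\,min}_{x}\Bigl\{\, f(x) + \tfrac{\beta}{2}\bigl\|Ax + By^{k} - b - \lambda^{k}/\beta\bigr\|^{2} \Bigr\},\\
y^{k+1} &= \operatorname*{arg\,min}_{y}\Bigl\{\, g(y) + \tfrac{\beta}{2}\bigl\|Ax^{k+1} + By - b - \lambda^{k}/\beta\bigr\|^{2} + \tfrac{1}{2}\bigl\|y - y^{k}\bigr\|_{D}^{2} \Bigr\},\\
\lambda^{k+1} &= \lambda^{k} - \gamma\beta\bigl(Ax^{k+1} + By^{k+1} - b\bigr).
\end{aligned}
\]

With D = 0 this reduces to the classical ADMM, for which the Fortin–Glowinski bound γ ∈ (0, (1+√5)/2) applies; the contribution summarized above is that a positive definite D allows γ to be taken beyond the golden ratio, with the admissible interval tied explicitly to the proximal parameter.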


References

  1. Hestenes, M.R.: Multiplier and gradient methods. J. Optim. Theory Appl. 4, 303–320 (1969)

  2. Powell, M.J.D.: A method for nonlinear constraints in minimization problems. In: Fletcher, R. (ed.) Optimization, pp 283–298. Academic Press, New York (1969)

  3. Chan, T.F., Glowinski, R.: Finite element approximation and iterative solution of a class of mildly nonlinear elliptic equations. Stanford report STAN-CS-78-674, Computer Science Department, Stanford University, Palo Alto CA (1978)

  4. Glowinski, R., Marroco, A.: Sur l’approximation, par éléments finis d’ordre un, et la résolution, par pénalisation-dualité d’une classe de problèmes de Dirichlet non linéaires. Revue française d’automatique, informatique, recherche opérationnelle. Analyse numérique 9(2), 41–76 (1975)

  5. Boyd, S., Parikh, N., Chu, E., Peleato, B., Eckstein, J.: Distributed optimization and statistical learning via the alternating direction method of multipliers. Found. Trends Mach. Learn. 3(1), 1–122 (2011)

  6. Eckstein, J., Yao, W.: Understanding the convergence of the alternating direction method of multipliers: Theoretical and computational perspectives. Pac. J. Optim. 11(4), 619–644 (2015)

  7. Glowinski, R.: On alternating direction methods of multipliers: a historical perspective. In: Fitzgibbon, W., Kuznetsov, Y.A., Neittaanmaki, P., Pironneau, O. (eds.) Modeling, Simulation and Optimization for Science and Technology. Computational Methods in Applied Sciences, vol. 34, pp 59–82. Springer, Dordrecht (2014)

  8. Glowinski, R., Pan, T.W., Tai, X.C.: Some facts about operator-splitting and alternating direction methods. In: Glowinski, R., Osher, S.J., Yin, W. (eds.) Splitting Methods in Communication, Imaging, Science, and Engineering, pp 19–94. Springer, New York (2016)

  9. Glowinski, R., Osher, S.J., Yin, W. (eds.): Splitting Methods in Communication, Imaging, Science, and Engineering. Springer, New York (2016)

  10. Cai, X.J., Gu, G.Y., He, B.S., Yuan, X.: A proximal point algorithm revisit on the alternating direction method of multipliers. Sci. China Math. 56(10), 2179–2186 (2013)

  11. Ma, S.Q.: Alternating proximal gradient method for convex minimization. J. Sci. Comput. 68, 546–572 (2016)

  12. Eckstein, J., Bertsekas, D.P.: On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math. Program. 55(1–3), 293–318 (1992)

  13. He, B.S., Liu, H., Wang, Z.R., Yuan, X.: A strictly contractive Peaceman–Rachford splitting method for convex programming. SIAM J. Optim. 24(3), 1011–1040 (2014)

  14. Ma, F.: On relaxation of some customized proximal point algorithms for convex minimization: from variational inequality perspective. Comput. Optim. Appl. 73(3), 871–901 (2019)

  15. He, B., Ma, F., Yuan, X.: Convergence study on the symmetric version of ADMM with larger step sizes. SIAM J. Imaging Sci. 9(3), 1467–1501 (2016)

  16. Fortin, M., Glowinski, R.: Méthodes de Lagrangien Augmenté: Application à la Résolution Numérique de Problèmes aux Limites. Dunod, Paris (1982)

  17. Fortin, M., Glowinski, R.: Augmented Lagrangian Methods: Applications to the Numerical Solution of Boundary-Value Problems. North-Holland, Amsterdam (1983)

  18. Glowinski, R.: Numerical Methods for Nonlinear Variational Problems. Springer, New York (1984)

  19. Wen, Z.W., Goldfarb, D., Yin, W.T.: Alternating direction augmented Lagrangian methods for semidefinite programming. Math. Program. Comput. 2, 203–230 (2010)

  20. Chen, C.H., He, B.S., Yuan, X.: Matrix completion via alternating direction method. IMA J. Numer. Anal. 32, 227–245 (2012)

  21. He, B.S., Xu, M.H., Yuan, X.: Solving large-scale least squares covariance matrix problems by alternating direction methods. SIAM J. Matrix Anal. Appl. 32, 136–152 (2011)

  22. Gabay, D., Mercier, B.: A dual algorithm for the solution of nonlinear variational problems via finite-element approximations. Comput. Math. Appl. 2, 17–40 (1976)

  23. Tao, M., Yuan, X.: On Glowinski’s open question on the alternating direction method of multipliers. J. Optim. Theory Appl. 179, 163–196 (2018)

  24. Eckstein, J.: Some saddle-function splitting methods for convex programming. Optim. Methods Softw. 4, 75–83 (1994)

  25. He, B.S., Liao, L.Z., Han, D.R., Yang, H.: A new inexact alternating directions method for monotone variational inequalities. Math. Program. 92, 103–118 (2002)

  26. Yang, J., Yuan, X.: Linearized augmented Lagrangian and alternating direction methods for nuclear norm minimization. Math. Comput. 82(281), 301–329 (2013)

  27. Deng, W., Yin, W.T.: On the global and linear convergence of the generalized alternating direction method of multipliers. J. Sci. Comput. 66, 889–916 (2016)

  28. Zhang, X.Q., Burger, M., Osher, S.: A unified primal-dual algorithm framework based on Bregman iteration. J. Sci. Comput. 46(1), 20–46 (2011)

  29. Chan, R.H., Tao, M., Yuan, X.: Linearized alternating direction method of multipliers for constrained linear least-squares problem. East Asian J. Appl. Math. 2, 326–341 (2012)

  30. Wang, X., Yuan, X.: The linearized alternating direction method of multipliers for Dantzig selector. SIAM J. Sci. Comput. 34(5), A2792–A2811 (2012)

  31. Xu, M.H.: Proximal alternating directions method for structured variational inequalities. J. Optim. Theory Appl. 134, 107–117 (2007)

  32. Gonçalves, M.L.N., Melo, J.G., Monteiro, R.D.C.: Extending the ergodic convergence rate of the proximal ADMM. arXiv:1611.02903 (2016)

  33. Wright, J., Ganesh, A., Min, K., Ma, Y.: Compressive principal component pursuit. Information and Inference: A Journal of the IMA 2(1), 32–68 (2013)

  34. Chambolle, A., Pock, T.: An introduction to continuous optimization for imaging. Acta Numerica 25, 161–319 (2016)

  35. Sra, S., Nowozin, S., Wright, S.J. (eds.): Optimization for Machine Learning. MIT Press (2012)

  36. Gonçalves, M.L.N., Melo, J.G., Monteiro, R.D.C.: Convergence rate bounds for a proximal ADMM with over-relaxation stepsize parameter for solving nonconvex linearly constrained problems. arXiv:1702.01850 (2017)

  37. Gu, Y., Jiang, B., Han, D.: A semi-proximal-based strictly contractive Peaceman–Rachford splitting method. arXiv:1506.02221 (2015)

  38. Gonçalves, M.L.N.: On the pointwise iteration-complexity of a dynamic regularized ADMM with over-relaxation stepsize. Appl. Math. Comput. 336, 315–325 (2018)

  39. Gonçalves, M.L.N., Alves, M.M., Melo, J.G.: Pointwise and ergodic convergence rates of a variable metric proximal alternating direction method of multipliers. J. Optim. Theory Appl. 177(2), 448–478 (2018)

  40. Facchinei, F., Pang, J.S.: Finite-Dimensional Variational Inequalities and Complementarity Problems, vol. 1. Springer Series in Operations Research. Springer, New York (2003)

  41. He, B., Yuan, X.: On the O(1/n) convergence rate of the Douglas–Rachford alternating direction method. SIAM J. Numer. Anal. 50(2), 700–709 (2012)

  42. Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B 58(1), 267–288 (1996)

  43. He, B., Ma, F., Yuan, X.: Optimal proximal augmented Lagrangian method and its application to full Jacobian splitting for multi-block separable convex minimization problems. IMA J. Numer. Anal. https://doi.org/10.1093/imanum/dry092 (2019)

  44. He, B., Ma, F., Yuan, X.: Optimally linearizing the alternating direction method of multipliers for convex programming. Comput. Optim. Appl. https://doi.org/10.1007/s10589-019-00152-3 (2017)

  45. Gao, B., Ma, F.: Symmetric alternating direction method with indefinite proximal regularization for linearly constrained convex optimization. J. Optim. Theory Appl. 176(1), 178–204 (2018)


Acknowledgments

The author is grateful to the anonymous referees and the editor for their valuable comments and suggestions, which have helped improve the presentation of this paper. He would like to thank Professor Bingsheng He for fruitful discussions and suggestions regarding this project, and Professor Shiqian Ma for providing the SPCP codes.

Author information

Corresponding author

Correspondence to Feng Ma.

Additional information

The author was supported by NSFC Grants 11701564 and 11871029.

About this article

Cite this article

Ma, F. Convergence study on the proximal alternating direction method with larger step size. Numer Algor 85, 399–425 (2020). https://doi.org/10.1007/s11075-019-00819-2

