Convergence Analysis of the Generalized Splitting Methods for a Class of Nonconvex Optimization Problems

Journal of Optimization Theory and Applications

Abstract

In this paper, we propose generalized splitting methods for solving a class of nonconvex optimization problems. The new methods extend the classic Douglas–Rachford and Peaceman–Rachford splitting methods, and in some special cases the range of admissible step-sizes can even be doubled. The new methods can also be applied to convex optimization problems. In particular, for convex problems we propose more relaxed conditions on the step-sizes and other parameters, and we prove global convergence and iteration complexity without any additional assumptions. Under a strong convexity assumption on the objective function, a linear convergence rate can be derived easily.
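As background for the schemes generalized in the paper, the following is a minimal Python sketch of the classic relaxed Douglas–Rachford/Peaceman–Rachford iteration for minimizing f(x) + g(x). It is an illustration only, not the authors' generalized algorithm; the function names, the step-size `gamma`, the relaxation parameter `alpha` and the toy soft-thresholding example are assumptions made for the sketch.

```python
import numpy as np

def relaxed_splitting(prox_f, prox_g, z0, gamma, alpha=0.5, n_iter=200):
    """Relaxed Douglas-Rachford / Peaceman-Rachford iteration for min_x f(x) + g(x).

    alpha = 0.5 recovers the classic Douglas-Rachford scheme,
    alpha = 1.0 the Peaceman-Rachford scheme; gamma > 0 is the proximal step-size.
    This is a background sketch, not the generalized method of the paper.
    """
    z = z0
    for _ in range(n_iter):
        x = prox_f(z, gamma)              # x^{k+1} = prox_{gamma f}(z^k)
        y = prox_g(2.0 * x - z, gamma)    # y^{k+1} = prox_{gamma g}(2 x^{k+1} - z^k)
        z = z + 2.0 * alpha * (y - x)     # relaxed update of the governing sequence
    return x

# Toy usage: f(x) = 0.5 * ||x - b||^2, g(x) = lam * ||x||_1 (soft-thresholding prox).
b, lam, gamma = np.array([3.0, -0.5, 0.2]), 1.0, 1.0
prox_f = lambda v, t: (v + t * b) / (1.0 + t)                            # prox of 0.5*||. - b||^2
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)  # soft threshold
print(relaxed_splitting(prox_f, prox_g, np.zeros(3), gamma, alpha=0.5))  # approx. [2., 0., 0.]
```

With `alpha = 0.5` the governing update reduces to the familiar z = z + (y - x) of Douglas–Rachford, while `alpha = 1.0` gives the Peaceman–Rachford update; the paper's contribution concerns how far such parameters can be pushed in the nonconvex setting.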


Acknowledgements

The authors are grateful to two anonymous referees for their helpful suggestions on improving the quality of the original manuscript. This work was supported by the National Natural Science Foundation of China (Grants 11771078 and 71661147004), the Natural Science Foundation of Jiangsu Province (Grant BK20181258) and Project 333 of Jiangsu Province (Grant BRA2018351).

Author information

Corresponding author

Correspondence to Min Li.

Additional information

Communicated by Alexander S. Strekalovsky.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Li, M., Wu, Z. Convergence Analysis of the Generalized Splitting Methods for a Class of Nonconvex Optimization Problems. J Optim Theory Appl 183, 535–565 (2019). https://doi.org/10.1007/s10957-019-01564-1

