Abstract
In this paper, we study a first-order inexact primal-dual algorithm (I-PDA) for solving a class of convex-concave saddle point problems. The I-PDA, which involves a relative error criterion and generalizes the classical PDA, has the advantage of solving one subproblem inexactly when it does not admit a closed-form solution. We show that the whole sequence generated by I-PDA converges to a saddle point solution with an \(\mathcal{O}(1/N)\) ergodic convergence rate, where N is the iteration number. In addition, under a mild calmness condition, we establish the global Q-linear convergence rate of the distance between the iterates generated by I-PDA and the solution set, and the R-linear convergence rate of the nonergodic iterates. Furthermore, we demonstrate that many problems arising from practical applications satisfy this calmness condition. Finally, some numerical experiments are performed to illustrate the efficiency and linear convergence behavior of I-PDA.
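The classical PDA that I-PDA generalizes alternates a dual proximal step, a primal proximal step, and an extrapolation of the primal iterate. The following is a minimal NumPy sketch of that classical (exact) scheme on a toy lasso-type problem; it is an illustration under our own choice of objective and step sizes, not the authors' implementation, and both subproblems here happen to have closed-form proximal maps (the paper's inexact variant is relevant precisely when one of them does not).

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t*||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pda(A, b, lam, tau, sigma, iters=500):
    """Classical first-order primal-dual algorithm (exact PDA) for
        min_x  lam*||x||_1 + 0.5*||A x - b||^2,
    written as the saddle point problem
        min_x max_y  lam*||x||_1 + <A x, y> - g*(y),
    with g*(y) = 0.5*||y||^2 + <b, y>.
    The standard step-size condition tau*sigma*||A||^2 <= 1 is assumed.
    """
    m, n = A.shape
    x = np.zeros(n)
    x_bar = x.copy()   # extrapolated primal iterate
    y = np.zeros(m)
    for _ in range(iters):
        # Dual step: prox of sigma*g* has the closed form below.
        y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
        x_old = x
        # Primal step: prox of tau*lam*||.||_1 is soft-thresholding.
        # (I-PDA would allow solving this step inexactly, subject to
        # a relative error criterion, when no closed form exists.)
        x = soft_threshold(x_old - tau * (A.T @ y), tau * lam)
        # Extrapolation of the primal iterate.
        x_bar = 2.0 * x - x_old
    return x
```

For instance, with `A` the identity, the minimizer is the soft-thresholding of `b`, which the iteration recovers to high accuracy within a few hundred steps.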
Funding
This research was partially supported by the National Natural Science Foundation of China under grants 11571178, 11871279 and 12001286, by the China Scholarship Council, by the Postgraduate Research & Practice Innovation Program of Jiangsu Province KYCX20_1163, and by the USA National Science Foundation under grant 1819161.
Cite this article
Jiang, F., Wu, Z., Cai, X. et al. A first-order inexact primal-dual algorithm for a class of convex-concave saddle point problems. Numer Algor 88, 1109–1136 (2021). https://doi.org/10.1007/s11075-021-01069-x
Keywords
- Convex optimization
- Saddle point problems
- First-order primal-dual algorithm
- Inexact
- Nonergodic convergence
- Linear convergence