Abstract
In a Hilbert space, we study the convergence of the subgradient method to a solution of a variational inequality in the presence of computational errors. Most results in the literature establish the convergence of optimization algorithms only when the computational errors are summable. In the present paper, the convergence of the subgradient method for solving variational inequalities is established for nonsummable computational errors. We show that the subgradient method generates a good approximate solution if the sequence of computational errors is bounded from above by a constant.
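To illustrate the setting of the abstract, the following is a minimal sketch of a projected subgradient iteration in which each subgradient evaluation is perturbed by an error of norm at most a fixed constant delta, rather than by summable errors. This is an illustrative toy, not the paper's exact scheme; the function names, the step-size choice, and the random error model are all assumptions made for the example.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, steps, alpha, delta=0.0, rng=None):
    """Projected subgradient method with bounded computational errors.

    subgrad : callable returning a subgradient at x (exact oracle)
    project : callable projecting a point onto the feasible set
    delta   : bound on the norm of the error added to each subgradient
              (a stand-in for the paper's nonsummable, bounded errors)
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = subgrad(x)
        if delta > 0:
            e = rng.normal(size=x.shape)
            e *= delta / max(np.linalg.norm(e), 1e-12)  # error of norm exactly delta
            g = g + e  # perturbed subgradient: the only oracle the method sees
        x = project(x - alpha * g)
    return x

# Toy example: minimize f(x) = |x - 1| over [0, 2]; the associated variational
# inequality is solved by x* = 1, and bounded errors only blur the final accuracy.
f_sub = lambda x: np.sign(x - 1.0)
proj = lambda x: np.clip(x, 0.0, 2.0)
x_hat = projected_subgradient(f_sub, proj, np.array([0.0]), 2000, 0.01, delta=0.005)
```

With a constant step size and errors bounded by `delta`, the iterates do not converge exactly but settle in a neighborhood of the solution whose radius scales with the step size and the error bound, which is the qualitative behavior the abstract describes.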
Additional information
Communicated by Vladimir F. Dem’yanov.
Cite this article
Zaslavski, A.J. The Extragradient Method for Solving Variational Inequalities in the Presence of Computational Errors. J Optim Theory Appl 153, 602–618 (2012). https://doi.org/10.1007/s10957-011-9975-3