Strong convergence theorems for inertial Tseng’s extragradient method for solving variational inequality problems and fixed point problems

  • Original Paper
  • Published:
Optimization Letters

Abstract

The aim of this paper is to introduce a new inertial Tseng's extragradient algorithm for solving variational inequality problems with pseudo-monotone, Lipschitz continuous mappings and fixed point problems for nonexpansive mappings in real Hilbert spaces. We prove a strong convergence theorem for the proposed algorithm under suitable assumptions on the parameters. Finally, we present some numerical experiments that support our main results.
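To make the ingredients of such a scheme concrete, the following is a minimal sketch of a generic inertial Tseng (forward-backward-forward) extragradient iteration with a Halpern-type anchoring step, applied to a variational inequality over a box. All names, the box constraint, the damped inertia rule, and the specific parameter choices here are illustrative textbook assumptions, not the exact algorithm or conditions of the paper.

```python
import numpy as np

def project_box(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def inertial_tseng_halpern(F, x0, lam=0.5, theta=0.3, iters=2000):
    """Sketch of an inertial Tseng extragradient method with Halpern anchoring.

    lam must satisfy lam < 1/L, where L is the Lipschitz constant of F.
    The inertia weight is damped so that theta_n * ||x_n - x_{n-1}|| is
    summable, a standard condition in inertial methods.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for n in range(1, iters + 1):
        alpha = 1.0 / (n + 1)                 # anchoring weights: -> 0, non-summable
        diff = np.linalg.norm(x - x_prev)
        th = theta if diff == 0 else min(theta, (1.0 / n**2) / diff)
        w = x + th * (x - x_prev)             # inertial extrapolation
        y = project_box(w - lam * F(w))       # forward-backward (projection) step
        z = y + lam * (F(w) - F(y))           # Tseng correction: second forward step
        x_prev, x = x, alpha * x0 + (1 - alpha) * z  # Halpern step toward the anchor x0
    return x

if __name__ == "__main__":
    # F(v) = v - c is strongly monotone and 1-Lipschitz; since c lies in the
    # box, the VI solution is c itself.
    c = np.array([0.5, -0.25])
    sol = inertial_tseng_halpern(lambda v: v - c, np.zeros(2))
    print(sol)
```

The Halpern step is what upgrades the usual weak convergence of Tseng's method to strong convergence in Hilbert spaces; a Mann-type averaging with a nonexpansive mapping could be inserted at the same point to also handle a fixed point constraint, which is the combination the paper studies.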


Acknowledgements

This work was supported by the NSF of China (Grant Nos. 11771063, 11971082), the Natural Science Foundation of Chongqing (cstc2020jcyj-msxmX0455), the Science and Technology Project of the Chongqing Education Committee (Grant No. KJZD-K201900504), and the Program of the Chongqing Innovation Research Group Project in University (Grant No. CXQT19018).

Corresponding author

Correspondence to Gang Cai.


About this article


Cite this article

Cai, G., Dong, QL. & Peng, Y. Strong convergence theorems for inertial Tseng’s extragradient method for solving variational inequality problems and fixed point problems. Optim Lett 15, 1457–1474 (2021). https://doi.org/10.1007/s11590-020-01654-4
