
Strong convergence theorem for a new Bregman extragradient method with a different line-search process for solving variational inequality problems in reflexive Banach spaces

  • Original Paper
  • Optimization Letters

Abstract

In this paper, we introduce a new Bregman extragradient method with a different line-search process for solving variational inequality problems in reflexive Banach spaces. More precisely, we prove that the sequence generated by the proposed iterative algorithm converges strongly to an element of the solution set of the variational inequality problem. Moreover, numerical examples are given to show the effectiveness of the proposed algorithm. The results obtained in this paper extend and improve many recent ones in the literature.
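The abstract does not spell out the algorithm, but the classical extragradient template it builds on can be sketched in the Euclidean setting, where the Bregman distance reduces to (1/2)||x − y||² and Bregman projections reduce to metric projections. The sketch below is an illustration only, not the paper's method: it uses a standard backtracking rule of the form τ‖F(x) − F(y)‖ ≤ μ‖x − y‖, and the names `F`, `proj_C` and parameters `tau0`, `mu`, `beta` are illustrative assumptions.

```python
def extragradient(F, proj_C, x0, tau0=1.0, mu=0.9, beta=0.5, iters=200, tol=1e-10):
    """Solve VI(F, C): find x* in C with <F(x*), y - x*> >= 0 for all y in C."""
    x = list(x0)
    for _ in range(iters):
        fx = F(x)
        tau = tau0
        while True:
            # Extrapolation step: y = P_C(x - tau * F(x)).
            y = proj_C([xi - tau * fi for xi, fi in zip(x, fx)])
            fy = F(y)
            # Backtrack until tau * ||F(x) - F(y)|| <= mu * ||x - y||.
            lhs = tau * sum((a - b) ** 2 for a, b in zip(fx, fy)) ** 0.5
            rhs = mu * sum((a - b) ** 2 for a, b in zip(x, y)) ** 0.5
            if lhs <= rhs:
                break
            tau *= beta
        # Main step evaluates the operator at the extrapolated point y.
        x_new = proj_C([xi - tau * fi for xi, fi in zip(x, fy)])
        if max(abs(a - b) for a, b in zip(x, x_new)) < tol:
            return x_new
        x = x_new
    return x

# Toy problem: F(x) = x - p is monotone and 1-Lipschitz; with C = [0, 1]^2 the
# VI solution is the metric projection of p onto C, approximately (1.0, 0.5).
p = [2.0, 0.5]
F = lambda x: [xi - pi for xi, pi in zip(x, p)]
proj_C = lambda x: [min(1.0, max(0.0, xi)) for xi in x]
sol = extragradient(F, proj_C, [0.0, 0.0])
```

The line search frees the method from needing the Lipschitz constant of F in advance, which is one motivation the paper shares with the classical literature.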


Data availability

This manuscript has no associated data.


Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant No. 12171435).

Author information

Correspondence to Shaotao Hu or Yuanheng Wang.


About this article


Cite this article

Hu, S., Wang, Y., Jing, P. et al. Strong convergence theorem for a new Bregman extragradient method with a different line-search process for solving variational inequality problems in reflexive Banach spaces. Optim Lett 18, 783–801 (2024). https://doi.org/10.1007/s11590-023-02019-3

