Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems

  • Conference paper
Mathematical Optimization Theory and Operations Research (MOTOR 2021)

Abstract

In this paper, we propose gradient descent type methods for solving convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and carry out a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper consists in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Up to the stopping moment the method "does not feel the noise", but after this moment the noise starts to accumulate and the quality of the solution degrades with further iterations.
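To make the scheme concrete, the sketch below shows a gradient method for the least-squares objective \(\tfrac{1}{2}\left\| Aq-f \right\| ^2\) with an additively perturbed gradient and a discrepancy-type stopping rule. This is a minimal illustration, not the paper's actual algorithm: the operator A, the noise level delta, and the step size are illustrative placeholders.

```python
import numpy as np

def inexact_gradient_descent(A, f, delta, step, max_iter=10000):
    """Minimize 0.5 * ||A q - f||^2 with an additively perturbed gradient.

    Stops once the residual reaches the noise level delta (a
    discrepancy-type rule): up to that moment the iteration does not
    "feel" the noise; afterwards the noise accumulates and further
    iterations degrade the reconstruction.
    """
    q = np.zeros(A.shape[1])
    for k in range(max_iter):
        residual = A @ q - f
        if np.linalg.norm(residual) <= delta:        # stopping rule
            return q, k
        noise = delta * np.random.randn(A.shape[1])  # additive gradient noise
        grad = A.T @ residual + noise                # inexact gradient
        q = q - step * grad                          # gradient-type update
    return q, max_iter
```

With zero noise and step size \(1/\left\| A \right\| ^2\), this update reduces to the classical Landweber iteration, one of the baselines used in the paper's comparison.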

The research of V.V. Matyukhin and A.V. Gasnikov in Sects. 1–4 was supported by the Russian Science Foundation (project No. 21-71-30005). The research of S.I. Kabanikhin, M.A. Shishlenin and N.S. Novikov in the last section was supported by RFBR (project No. 19-01-00694) and by the comprehensive program of fundamental scientific research of the SB RAS II.1 (project No. 0314-2018-0009). The work of A. Vasin was supported by the Andrei M. Raigorodskii Scholarship in Optimization.

Notes

  1. Indeed, if there exists q such that \(Aq=f\), then for all \(\lambda \) we have \(\left\langle {Aq,\lambda } \right\rangle =\left\langle {f,\lambda } \right\rangle \), and hence \(\left\langle {q,A^*\lambda } \right\rangle =\left\langle {f,\lambda } \right\rangle \). Assume that there exists a \(\lambda \) such that \(A^*\lambda =0\) and \(\left\langle {f,\lambda } \right\rangle >0\). Then we obtain a contradiction:

    $$0=\left\langle {q,A^*\lambda } \right\rangle =\left\langle {f,\lambda } \right\rangle >0.$$

    (A finite-dimensional check of the adjoint identity used here is sketched after these notes.)

  2. Recall that \(R=\left\| q_* - y^0 \right\| _2\).

  3. For the mathematical background of the described example, see the full version of the paper [15].
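Footnote 1 rests on the adjoint identity \(\left\langle {Aq,\lambda } \right\rangle =\left\langle {q,A^*\lambda } \right\rangle \). The snippet below is a minimal finite-dimensional sanity check of that identity, where \(A^*\) is simply the transpose; the matrix and vectors are randomly generated illustrations, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # a generic linear operator from R^3 to R^5
q = rng.standard_normal(3)
lam = rng.standard_normal(5)

# In R^n the adjoint A* is the transpose, so <A q, lam> = <q, A^T lam>.
assert np.isclose((A @ q) @ lam, q @ (A.T @ lam))
```

In particular, if \(A^*\lambda =0\) while \(\left\langle {f,\lambda } \right\rangle >0\), this identity makes \(Aq=f\) unsolvable, which is exactly the contradiction derived in the footnote.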

References

  1. Anikin, A., Gasnikov, A., Dvurechensky, P., Turin, A., Chernov, A.: Dual approaches to the strongly convex simple function minimization problem under affine restrictions. arXiv preprint arXiv:1602.01686 (2016)

  2. Chernov, A., Dvurechensky, P., Gasnikov, A.: Fast primal-dual gradient method for strongly convex minimization problems with linear Constraints. In: Kochetov, Y., Khachay, M., Beresnev, V., Nurminski, E., Pardalos, P. (eds.) DOOR 2016. LNCS, vol. 9869, pp. 391–403. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-44914-2_31

  3. Devolder, O.: Exactness, inexactness and stochasticity in first-order methods for large-scale convex optimization. Ph.D. thesis (2013)

  4. Dvinskikh, D., Gasnikov, A.: Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems. J. Inverse Ill-posed Problems (2021)

  5. Dvurechensky, P.E., Gasnikov, A.V., Nurminski, E.A., Stonyakin, F.S.: Advances in low-memory subgradient optimization. In: Bagirov, A.M., Gaudioso, M., Karmitsa, N., Mäkelä, M.M., Taheri, S. (eds.) Numerical Nonsmooth Optimization, pp. 19–59. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-34910-3_2

  6. Evtushenko, Y.: Optimization and fast automatic differentiation. Preprint CCAS (2013)

  7. Gasnikov, A.: Universal gradient descent. arXiv preprint arXiv:1711.00394 (2017)

  8. Gasnikov, A., Dvurechensky, P., Nesterov, Y.: Stochastic gradient methods with inexact oracle. TRUDY MIPT 8(1), 41–91 (2016)

  9. Gasnikov, A.V., Nesterov, Y.E.: Universal method for stochastic composite optimization problems. Comput. Math. Math. Phys. 58(1), 48–64 (2018)

  10. Gasnikov, A., Tyurin, A.: Fast gradient descent for convex minimization problems with an oracle producing a (\(\delta \), L)-model of function at the requested point. Comput. Math. Math. Phys. 59(7), 1085–1097 (2019)

  11. Halmos, P.R.: A Hilbert Space Problem Book, vol. 19. Springer, Heidelberg (2012)

  12. Kabanikhin, S.I.: Definitions and examples of inverse and ill-posed problems. J. Inverse Ill-Posed Probl. 16(4), 317–357 (2008)

  13. Kamzolov, D., Dvurechensky, P., Gasnikov, A.V.: Universal intermediate gradient method for convex problems with inexact oracle. Optim. Methods Softw. 1–28 (2020)

  14. Kantorovich, L.V.: Functional analysis and applied mathematics. Uspekhi Matematicheskikh Nauk 3(6), 89–185 (1948)

  15. Matyukhin, V., Kabanikhin, S., Shishlenin, M., Novikov, N., Vasin, A., Gasnikov, A.: Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems. WIAS Preprint 2815 (2021)

  16. Nemirovskij, A.S., Yudin, D.B.: Problem complexity and method efficiency in optimization. Wiley-Interscience (1983)

  17. Nesterov, Y.: Universal gradient methods for convex optimization problems. Math. Program. 152(1), 381–404 (2015)

  18. Nesterov, Y., Gasnikov, A., Guminov, S., Dvurechensky, P.: Primal-dual accelerated gradient methods with small-dimensional relaxation oracle. Optim. Methods Softw. 1–38 (2020)

  19. Nesterov, Y.: Lectures on Convex Optimization, vol. 137. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-91578-4

  20. Nocedal, J., Wright, S.: Numerical Optimization. Springer, New York (2006)

  21. Poljak, B.: Iterative algorithms for singular minimization problems. In: Nonlinear Programming 4, pp. 147–166. Elsevier (1981)

  22. Polyak, B.T.: Introduction to Optimization. Optimization Software, Inc., Publications Division, New York (1987)

  23. Stonyakin, F., et al.: Inexact relative smoothness and strong convexity for optimization and variational inequalities by inexact model. arXiv preprint arXiv:2001.09013 (2020)

  24. Stonyakin, F.S., et al.: Gradient methods for problems with inexact model of the objective. In: Khachay, M., Kochetov, Y., Pardalos, P. (eds.) MOTOR 2019. LNCS, vol. 11548, pp. 97–114. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-22629-9_8

  25. Tseng, P.: On accelerated proximal gradient methods for convex-concave optimization. SIAM J. Optim. 1 (2008)

  26. Tyurin, A.: Mirror version of similar triangles method for constrained optimization problems. arXiv preprint arXiv:1705.09809 (2017)

  27. Vasiliev, F.: Optimization Methods. MCCME, Moscow (2011). In 2 volumes

Author information

Correspondence to Vladislav Matyukhin.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Matyukhin, V., Kabanikhin, S., Shishlenin, M., Novikov, N., Vasin, A., Gasnikov, A. (2021). Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems. In: Pardalos, P., Khachay, M., Kazakov, A. (eds.) Mathematical Optimization Theory and Operations Research. MOTOR 2021. Lecture Notes in Computer Science, vol. 12755. Springer, Cham. https://doi.org/10.1007/978-3-030-77876-7_11

  • DOI: https://doi.org/10.1007/978-3-030-77876-7_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-77875-0

  • Online ISBN: 978-3-030-77876-7

  • eBook Packages: Computer Science (R0)
