
Proximal gradient methods for multiobjective optimization and their applications

Published in: Computational Optimization and Applications

Abstract

We propose new descent methods for unconstrained multiobjective optimization problems, where each objective function can be written as the sum of a continuously differentiable function and a proper convex, but not necessarily differentiable, one. The methods extend the well-known proximal gradient algorithms for scalar-valued nonlinear optimization, which are known to be efficient for particular classes of problems. Here, we consider two types of algorithms: with and without line searches. Under mild assumptions, we prove that each accumulation point of the sequence generated by these algorithms, if one exists, is Pareto stationary. Moreover, we present applications of the methods to constrained multiobjective optimization and to robust multiobjective optimization, which handles uncertainty in the problem data. In particular, for the robust case, we show that the subproblems of the proximal gradient algorithms can be cast as quadratic programming, second-order cone programming, or semidefinite programming problems. For these cases, we also carry out numerical experiments that demonstrate the validity of the proposed methods.
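For readers unfamiliar with the scalar-valued scheme the paper generalizes, the sketch below is a minimal, illustrative Python implementation of the classical (ISTA-type) proximal gradient iteration, applied to a LASSO instance where the smooth part is a least-squares term and the convex nonsmooth part is an \(\ell_1\) penalty. It is not the multiobjective method of the paper; the problem data, step size, and all function names here are our own assumptions for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, prox_g, x0, step, max_iter=500, tol=1e-8):
    # Scalar proximal gradient iteration:
    #   x_{k+1} = prox_{step * g}(x_k - step * grad_f(x_k))
    x = x0
    for _ in range(max_iter):
        x_new = prox_g(x - step * grad_f(x), step)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative LASSO instance: min 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L, L = Lipschitz const. of grad f
prox_g = lambda v, t: soft_threshold(v, lam * t)
x_star = proximal_gradient(grad_f, prox_g, np.zeros(10), step)
```

In the multiobjective setting studied in the paper, the single gradient step above is replaced by a subproblem involving all objectives at once, whose accumulation points are shown to be Pareto stationary.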


Notes

  1. We denote \(A \succeq (\succ ) O\) when A is positive semidefinite (positive definite). Also, \(A \succeq (\succ ) B\) if and only if \(A - B \succeq (\succ ) O\).

  2. Here, \(\dim \) denotes the dimension of a space and \(\ker \) denotes the kernel of a matrix. A small numerical illustration of this notation is given after these notes.
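
For concreteness, the following NumPy snippet (our own illustration, not from the paper) checks the order \(A \succeq B\) defined in note 1 numerically and computes \(\dim \ker \) of a matrix as described in note 2.

```python
import numpy as np

# Note 1: A >= B in the positive semidefinite order iff A - B >= O,
# i.e., all eigenvalues of the symmetric matrix A - B are nonnegative.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 1.0]])
is_psd = np.all(np.linalg.eigvalsh(A - B) >= -1e-12)   # True

# Note 2: dim ker M = (number of columns of M) - rank(M).
M = np.array([[1.0, 2.0], [2.0, 4.0]])                 # rank 1
dim_ker = M.shape[1] - np.linalg.matrix_rank(M)        # 1
```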



Acknowledgements

This work was supported by the Kyoto University Foundation and by the Grant-in-Aid for Scientific Research (C) (17K00032) from the Japan Society for the Promotion of Science. We are also grateful to the anonymous referees for their useful comments.

Author information

Corresponding author

Correspondence to Ellen H. Fukuda.


Cite this article

Tanabe, H., Fukuda, E.H. & Yamashita, N. Proximal gradient methods for multiobjective optimization and their applications. Comput Optim Appl 72, 339–361 (2019). https://doi.org/10.1007/s10589-018-0043-x

