Abstract
This paper introduces a Riemannian BFGS method for multiobjective optimization problems whose objective functions are strongly retraction-convex. The method is a natural extension of its Euclidean counterpart and generates a sequence of iterates that converges to a Pareto optimal point from any starting point. The key component of the globalization strategy is a generalized Wolfe line search. Numerical experiments demonstrate the effectiveness of the proposed algorithm and its superiority over existing approaches.
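For orientation, the quasi-Newton machinery behind a method of this kind can be summarized in a few formulas. The sketch below is an illustrative outline under standard assumptions rather than a verbatim restatement of the algorithm analyzed in the paper: $\mathcal{M}$ denotes the manifold, $R_x$ a retraction, $\mathcal{T}_{x \to y}$ a vector transport, $f_1, \dots, f_m$ the objective functions, and $B_j^k$ a self-adjoint positive-definite approximation of the Riemannian Hessian of $f_j$ at the iterate $x^k$; the constants $0 < c_1 < c_2 < 1$ and the precise form of the curvature condition are assumptions of this sketch. The search direction is obtained from the subproblem
\[
d^k \in \operatorname*{arg\,min}_{d \in T_{x^k}\mathcal{M}} \; \max_{1 \le j \le m} \Big\{ \langle \operatorname{grad} f_j(x^k), d \rangle_{x^k} + \tfrac12 \langle B_j^k d, d \rangle_{x^k} \Big\},
\]
the next iterate is $x^{k+1} = R_{x^k}(\alpha_k d^k)$, and the step size $\alpha_k$ is required to satisfy generalized Wolfe conditions of the form
\[
f_j\big(R_{x^k}(\alpha_k d^k)\big) \le f_j(x^k) + c_1 \alpha_k \, \mathcal{D}(x^k, d^k), \quad j = 1, \dots, m,
\qquad
\mathcal{D}\big(x^{k+1}, \mathcal{T}_{x^k \to x^{k+1}} d^k\big) \ge c_2 \, \mathcal{D}(x^k, d^k),
\]
where $\mathcal{D}(x, d) := \max_{1 \le j \le m} \langle \operatorname{grad} f_j(x), d \rangle_x$. Each Hessian approximation is then updated by a BFGS formula applied to the transported quantities
\[
s^k := \mathcal{T}_{x^k \to x^{k+1}}(\alpha_k d^k),
\qquad
y_j^k := \operatorname{grad} f_j(x^{k+1}) - \mathcal{T}_{x^k \to x^{k+1}} \operatorname{grad} f_j(x^k),
\]
namely, with $\tilde{B}_j^k := \mathcal{T}_{x^k \to x^{k+1}} \circ B_j^k \circ \mathcal{T}_{x^k \to x^{k+1}}^{-1}$,
\[
B_j^{k+1} v = \tilde{B}_j^k v
- \frac{\langle s^k, \tilde{B}_j^k v \rangle_{x^{k+1}}}{\langle s^k, \tilde{B}_j^k s^k \rangle_{x^{k+1}}} \, \tilde{B}_j^k s^k
+ \frac{\langle y_j^k, v \rangle_{x^{k+1}}}{\langle y_j^k, s^k \rangle_{x^{k+1}}} \, y_j^k,
\qquad v \in T_{x^{k+1}}\mathcal{M},
\]
with the update skipped (or damped) whenever $\langle y_j^k, s^k \rangle_{x^{k+1}} \le 0$, a situation the Wolfe step is designed to help rule out. For $m = 1$, $\mathcal{M} = \mathbb{R}^n$, $R_x(d) = x + d$, and the identity transport, this outline reduces to the classical BFGS method with Wolfe line searches.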


Data Availability
The data that support the findings of this study are available from the corresponding author upon reasonable request.
Acknowledgements
We thank the editor and the anonymous referees for their valuable feedback, which improved the quality of this paper.
Ethics declarations
Conflict of interest
The authors have no conflict of interest to declare.
About this article
Cite this article
Najafi, S., Hajarian, M. Multiobjective BFGS method for optimization on Riemannian manifolds. Comput Optim Appl 87, 337–354 (2024). https://doi.org/10.1007/s10589-023-00522-y