Abstract
Recently, Gonçalves and Prudente proposed an extension of the Hager–Zhang nonlinear conjugate gradient method for vector optimization (Comput Optim Appl 76:889–916, 2020). They first showed that a direct extension of the Hager–Zhang method to vector optimization may fail to produce descent directions in the vector sense, even with an exact line search. By employing a sufficiently accurate line search, they then introduced a self-adjusting Hager–Zhang conjugate gradient method in the vector sense and proved its global convergence without regular restarts or any convexity assumptions. In this paper, we propose an alternative extension of the Hager–Zhang nonlinear conjugate gradient method for vector optimization that preserves its desirable scalar property, namely, sufficient descent without relying on any line search or convexity assumption. Furthermore, we establish its global convergence under the Wolfe line search and mild assumptions. Finally, numerical experiments are presented to illustrate the practical behavior of the proposed method.
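As background for the abstract, the following is a minimal sketch of the *scalar* Hager–Zhang conjugate gradient iteration, which the paper extends to the vector setting. It is illustrative only: it minimizes a small quadratic with an exact line search (possible in closed form for quadratics), and the variable names and the toy problem are our own, not the paper's. The direction update is d_{k+1} = -g_{k+1} + β_k d_k with the Hager–Zhang choice β_k = (y_k - 2 d_k ‖y_k‖²/(d_kᵀy_k))ᵀ g_{k+1} / (d_kᵀy_k), where y_k = g_{k+1} - g_k.

```python
# Illustrative scalar Hager-Zhang CG on f(x) = 0.5 x^T A x - b^T x.
# (The paper's method operates in the vector-optimization setting; this
# sketch only shows the classical scalar direction update.)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, u, v):  # returns a*u + v, componentwise
    return [a * ui + vi for ui, vi in zip(u, v)]

A = [[4.0, 1.0], [1.0, 3.0]]  # symmetric positive definite
b = [1.0, 2.0]

def grad(x):  # gradient of the quadratic: A x - b
    return [dot(row, x) - bi for row, bi in zip(A, b)]

def hz_beta(g_new, g_old, d):
    """Hager-Zhang beta: ((y - 2 d ||y||^2 / d^T y)^T g_new) / d^T y."""
    y = [gn - go for gn, go in zip(g_new, g_old)]
    dy = dot(d, y)
    if abs(dy) < 1e-12:
        return 0.0
    w = [yi - 2.0 * dot(y, y) / dy * di for yi, di in zip(y, d)]
    return dot(w, g_new) / dy

x = [0.0, 0.0]
g = grad(x)
d = [-gi for gi in g]  # initial direction: steepest descent
for _ in range(50):
    if dot(g, g) < 1e-16:
        break
    # exact line search for a quadratic: t = -g^T d / (d^T A d)
    Ad = [dot(row, d) for row in A]
    t = -dot(g, d) / dot(d, Ad)
    x = axpy(t, d, x)
    g_new = grad(x)
    d = axpy(hz_beta(g_new, g, d), d, [-gi for gi in g_new])
    g = g_new
# x now approximates the minimizer, i.e. the solution of A x = b
```

On this 2-D quadratic with an exact line search, the iteration terminates at the solution of A x = b in two steps, as expected for a conjugate gradient scheme. The sufficient descent property g_{k+1}ᵀ d_{k+1} ≤ -(7/8)‖g_{k+1}‖² that Hager and Zhang proved for this β is exactly the scalar property the paper's vector extension is designed to preserve.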

Data availability
The codes are freely available at https://github.com/zlpjulie/vector-optimization1.git.
References
Ansary, M.A.T., Panda, G.: A modified quasi-Newton method for vector optimization problem. Optimization 64(11), 2289–2306 (2015)
Ansary, M.A.T.: A Newton-type proximal gradient method for nonlinear multi-objective optimization problems. Optim. Methods Softw. 38(3), 570–590 (2023)
Beck, A.: First-Order Methods in Optimization. SIAM, Philadelphia (2017). https://doi.org/10.1137/1.9781611974997
Bello Cruz, J.Y.: A subgradient method for vector optimization problems. SIAM J. Optim. 23(4), 2169–2182 (2013)
Bonnel, H., Iusem, A.N., Svaiter, B.F.: Proximal methods in vector optimization. SIAM J. Optim. 15(4), 953–970 (2005)
Burachik, R.S., Kaya, C.Y., Rizvi, M.M.: A new scalarization technique and new algorithms to generate Pareto fronts. SIAM J. Optim. 27(2), 1010–1034 (2017)
Ceng, L.C., Mordukhovich, B.S., Yao, J.C.: Hybrid approximate proximal method with auxiliary variational inequality for vector optimization. J. Optim. Theory Appl. 146(2), 267–303 (2010)
Chen, W., Yang, X.M., Zhao, Y.: Memory gradient method for multiobjective optimization. Appl. Math. Comput. 443, 127791 (2023)
Chuong, T.D.: Newton-like methods for efficient solutions in vector optimization. Comput. Optim. Appl. 54(3), 495–516 (2013)
Custódio, A.L., Madeira, J.F.A., Vaz, A.I.F., Vicente, L.N.: Direct multisearch for multiobjective optimization. SIAM J. Optim. 21(3), 1109–1140 (2011)
Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient method with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Das, I., Dennis, J.: Normal-boundary intersection: a new method for generating the Pareto surface in nonlinear multicriteria optimization problems. SIAM J. Optim. 8(3), 631–657 (1998)
De, P., Ghosh, J.B., Wells, C.E.: On the minimization of completion time variance with a bicriteria extension. Oper. Res. 40(6), 1148–1155 (1992)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
Eichfelder, G.: An adaptive scalarization method in multiobjective optimization. SIAM J. Optim. 19(4), 1694–1718 (2009)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
Fliege, J., Vicente, L.N.: Multicriteria approach to bilevel optimization. J. Optim. Theory Appl. 131(2), 209–225 (2006)
Fliege, J., Svaiter, B.F.: Steepest descent methods for multicriteria optimization. Math. Methods Oper. Res. 51(3), 479–494 (2000)
Fliege, J., Graña Drummond, L.M., Svaiter, B.F.: Newton’s method for multiobjective optimization. SIAM J. Optim. 20(2), 602–626 (2009)
Fukuda, E.H., Graña Drummond, L.M.: Inexact projected gradient method for vector optimization. Comput. Optim. Appl. 54(3), 473–493 (2013)
Fletcher, R.: Practical Methods of Optimization, Unconstrained Optimization, vol. 1. Wiley, New York (1987)
Geoffrion, A.M.: Proper efficiency and the theory of vector maximization. J. Math. Anal. Appl. 22(3), 618–630 (1968)
Graña Drummond, L.M., Raupp, F.M.P., Svaiter, B.F.: A quadratically convergent Newton method for vector optimization. Optimization 63(5), 661–677 (2014)
Graña Drummond, L.M., Svaiter, B.F.: A steepest descent method for vector optimization. J. Comput. Appl. Math. 175(2), 395–414 (2005)
Graña Drummond, L.M., Iusem, A.N.: A projected gradient method for vector optimization problems. Comput. Optim. Appl. 28(1), 5–29 (2004)
Gonçalves, M.L.N., Lima, F.S., Prudente, L.F.: Globally convergent Newton-type methods for multiobjective optimization. Comput. Optim. Appl. 83(2), 403–434 (2022)
Gonçalves, M.L.N., Lima, F.S., Prudente, L.F.: A study of Liu–Storey conjugate gradient methods for vector optimization. Appl. Math. Comput. 425, 127099 (2022)
Gonçalves, M.L.N., Prudente, L.F.: On the extension of the Hager–Zhang conjugate gradient method for vector optimization. Comput. Optim. Appl. 76(3), 889–916 (2020)
Hager, W.W., Zhang, H.C.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hager, W.W., Zhang, H.C.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952)
He, Q.R., Chen, C.R., Li, S.J.: Spectral conjugate gradient method for vector optimization. Comput. Optim. Appl. (2023). https://doi.org/10.1007/s10589-023-00508-w
Hillermeier, C.: Generalized homotopy approach to multiobjective optimization. J. Optim. Theory Appl. 110(3), 557–583 (2001)
Hong, T.S., Craft, D.L., Carlsson, F.: Multicriteria optimization in intensity-modulated radiation therapy treatment planning for locally advanced cancer of the pancreatic head. Int. J. Radiat. Oncol. Biol. Phys. 72(4), 1208–1214 (2008)
Huband, S., Hingston, P., Barone, L.: A review of multiobjective test problems and a scalable test problem toolkit. IEEE Trans. Evol. Comput. 10(5), 477–506 (2006)
Jahn, J., Kirsch, A., Wagner, C.: Optimization of rod antennas of mobile phones. Math. Methods Oper. Res. 59(1), 37–51 (2004)
Jin, Y., Olhofer, M., Sendhoff, B.: Dynamic weighted aggregation for evolutionary multi-objective optimization: why does it work and how? In: Proceedings of the genetic and evolutionary computation conference, pp. 1042–1049 (2001)
Leschine, T.M., Wallenius, H., Verdini, W.A.: Interactive multiobjective analysis and assimilative capacity-based ocean disposal decisions. Eur. J. Oper. Res. 56(2), 278–289 (1992)
Lovison, A.: Singular continuation: generating piecewise linear approximations to pareto sets via global analysis. SIAM J. Optim. 21(2), 463–490 (2011)
Lu, F., Chen, C.R.: Newton-like methods for solving vector optimization problems. Appl. Anal. 93(8), 1567–1586 (2014)
Lucambio Pérez, L.R., Prudente, L.F.: A Wolfe line search algorithm for vector optimization. ACM Trans. Math. Softw. 45(4), 1–23 (2019)
Lucambio Pérez, L.R., Prudente, L.F.: Nonlinear conjugate gradient methods for vector optimization. SIAM J. Optim. 28(3), 2690–2720 (2018)
Luc, D.T.: Theory of Vector Optimization. Lecture Notes in Economics and Mathematical Systems, vol. 319. Springer, Berlin (1989)
Miglierina, E., Molho, E., Recchioni, M.C.: Box-constrained multiobjective optimization: a gradient-like method without a priori scalarization. Eur. J. Oper. Res. 188(3), 662–682 (2008)
Mita, K., Fukuda, E.H., Yamashita, N.: Nonmonotone line searches for unconstrained multiobjective optimization problems. J. Global Optim. 75, 63–90 (2019)
Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Testing unconstrained optimization software. ACM Trans. Math. Softw. 7(1), 17–41 (1981)
Preuss, M., Naujoks, B., Rudolph, G.: Pareto set and EMOA behavior for simple multimodal multiobjective functions. In: Parallel Problem Solving from Nature-PPSN IX. Springer, Berlin, Heidelberg, pp. 513–522 (2006)
Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)
Schütze, O., Laumanns, M., Coello, C.A., Dellnitz, M., Talbi, E.G.: Convergence of stochastic search algorithms to finite size Pareto set approximations. J. Global Optim. 41(4), 559–577 (2008)
Tavana, M.: A subjective assessment of alternative mission architectures for the human exploration of Mars at NASA using multicriteria decision making. Comput. Oper. Res. 31(7), 1147–1164 (2004)
Thomann, J., Eichfelder, G.: Numerical results for the multiobjective trust region algorithm MHT. Data Brief 25, 104103 (2019)
Toint, P.L.: Test problems for partially separable optimization and results for the routine PSPMIN. Technical Report, Department of Mathematics, University of Namur, Namur, Belgium (1983)
White, D.J.: Epsilon-dominating solutions in mean-variance portfolio analysis. Eur. J. Oper. Res. 105(3), 457–466 (1998)
Zhao, X.P., Jolaoso, L.O., Shehu, Y., Yao, J.C.: Convergence of a nonmonotone projected gradient method for nonconvex multiobjective optimization. J. Nonlinear Var. Anal. 5, 441–457 (2021)
Acknowledgements
The authors wish to thank the anonymous referees for their constructive comments and suggestions, which greatly improved this paper.
Ethics declarations
Conflict of interest
The authors have no conflicts of interest to disclose.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This work was supported in part by the NNSF of China (Nos. 11961011, 11761014), the Guangxi Science and Technology Base and Talents Special Project (No. 2021AC06001), and the Innovation Project of Guangxi Graduate Education (No. YCSW2022282).
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Hu, Q., Zhu, L. & Chen, Y. Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization. Comput Optim Appl 88, 217–250 (2024). https://doi.org/10.1007/s10589-023-00548-2