
Nonmonotone line searches for unconstrained multiobjective optimization problems

Journal of Global Optimization

Abstract

In the last two decades, many descent methods for multiobjective optimization problems were proposed. In particular, the steepest descent and the Newton methods were studied for the unconstrained case. In both methods, the search directions are computed by solving convex subproblems, and the stepsizes are obtained by an Armijo-type line search. As a consequence, the objective function values decrease at each iteration of the algorithms. In this work, we consider nonmonotone line searches, i.e., we allow the increase of objective function values in some iterations. Two well-known types of nonmonotone line searches are considered here: the one that takes the maximum of recent function values, and the one that takes their average. We also propose a new nonmonotone technique specifically for multiobjective problems. Under reasonable assumptions, we prove that every accumulation point of the sequence produced by the nonmonotone version of the steepest descent and Newton methods is Pareto critical. Moreover, we present some numerical experiments, showing that the nonmonotone technique is also efficient in the multiobjective case.
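As an illustration of the max-type rule discussed above (due to Grippo, Lampariello and Lucidi [7]), the following sketch shows a backtracking line search that accepts a stepsize when every objective lies below the componentwise maximum of the last few function values plus the usual Armijo decrease term. This is a minimal illustration under our own simplifications; the function names and interface are hypothetical, not the paper's exact algorithm.

```python
import numpy as np

def nonmonotone_armijo(F, grad_F, x, d, history, c=1e-4, rho=0.5, t0=1.0, max_iter=50):
    """Backtracking line search with a max-type nonmonotone Armijo rule.

    F(x) returns the m objective values as an array; grad_F(x) returns the
    m x n Jacobian. `history` is a list of objective-value arrays from recent
    iterations. A stepsize t is accepted when, for every objective j,
        F_j(x + t d) <= max_k history[k][j] + c * t * <grad F_j(x), d>.
    """
    ref = np.max(np.vstack(history), axis=0)  # componentwise max of recent values
    slopes = grad_F(x) @ d                    # directional derivatives, one per objective
    t = t0
    for _ in range(max_iter):
        if np.all(F(x + t * d) <= ref + c * t * slopes):
            return t
        t *= rho                              # backtrack
    return t
```

With `history` containing only the current value, this reduces to the usual monotone Armijo rule; keeping several past values allows the accepted step to increase some objectives temporarily.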



Notes

  1. This is actually a modified version of [1] that can be found in [4].

  2. In the original version, either the bounds L and U or the number of variables n can be modified.

  3. It is an adaptation of a single-objective optimization problem to the multiobjective setting. Since the problem is originally unconstrained, we also added some bound constraints.

References

  1. Das, I., Dennis, J.E.: Normal-boundary intersection: a new method for generating the Pareto surface in nonlinear multicriteria optimization problems. SIAM J. Optim. 8(3), 631–657 (1998)


  2. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)


  3. Fazzio, N., Schuverdt, M.L.: Convergence analysis of a nonmonotone projected gradient method for multiobjective optimization problems (2018) (submitted)

  4. Fliege, J., Graña Drummond, L.M., Svaiter, B.F.: Newton’s method for multiobjective optimization. SIAM J. Optim. 20(2), 602–626 (2009)


  5. Fliege, J., Svaiter, B.F.: Steepest descent methods for multicriteria optimization. Math. Methods Oper. Res. 51(3), 479–494 (2000)


  6. Fukuda, E.H., Graña Drummond, L.M.: A survey on multiobjective descent methods. Pesquisa Oper. 34(3), 585–620 (2014)


  7. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23(4), 707–716 (1986)


  8. Jin, Y., Olhofer, M., Sendhoff, B.: Dynamic weighted aggregation for evolutionary multi-objective optimization: Why does it work and how? In: GECCO’01 Proceedings of the 3rd Annual Conference on Genetic and Evolutionary Computation, pp. 1042–1049 (2001)

  9. Kim, I.Y., de Weck, O.L.: Adaptive weighted-sum method for bi-objective optimization: Pareto front generation. Struct. Multidiscip. Optim. 29(2), 149–158 (2005)


  10. Laumanns, M., Thiele, L., Deb, K., Zitzler, E.: Combining convergence and diversity in evolutionary multiobjective optimization. Evolut. Comput. 10(3), 263–282 (2002)


  11. Luc, D.T.: Scalarization of vector optimization problems. J. Optim. Theory Appl. 55(1), 85–102 (1987)


  12. Mita, K.: Nonmonotone line search in multiobjective settings (in Japanese). Undergraduate research, Kyoto University (2017)

  13. Mita, K., Fukuda, E.H., Yamashita, N.: On using nonmonotone line search techniques in steepest descent methods for multiobjective optimization (in Japanese). In: Proceedings of the 61st Annual Conference of the Institute of Systems, Control and Information Engineers (2017)

  14. Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Testing unconstrained optimization software. ACM Trans. Math. Softw. 7(1), 17–41 (1981)


  15. Nocedal, J.: Updating quasi-Newton matrices with limited storage. Math. Comput. 35(151), 773–782 (1980)


  16. Ogata, Y., Saito, Y., Tanaka, T., Yamada, S.: Sublinear scalarization methods for sets with respect to set-relations. Linear Nonlinear Anal. 3(1), 121–132 (2017)


  17. Qu, S., Ji, Y., Jiang, J., Zhang, Q.: Nonmonotone gradient methods for vector optimization with a portfolio optimization application. Eur. J. Oper. Res. 263, 356–366 (2017)


  18. Stadler, W., Dauer, J.: Multicriteria optimization in engineering: a tutorial and survey. In: Kamat, M.P. (ed.) Progress in Aeronautics and Astronautics: Structural Optimization: Status and Promise, vol. 150, pp. 209–249. American Institute of Aeronautics and Astronautics, Reston (1992)


  19. Toint, Ph.L.: Test problems for partially separable optimization and results for the routine PSPMIN. Tech. Rep. 83/4, Department of Mathematics, University of Namur, Brussels (1983)

  20. Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14(4), 1043–1056 (2004)


  21. Zitzler, E., Deb, K., Thiele, L.: Comparison of multiobjective evolutionary algorithms: empirical results. Evolut. Comput. 8(2), 173–195 (2000)



Acknowledgements

We would like to thank the anonymous referees for their suggestions, which improved the original version of the paper.

Author information

Corresponding author

Correspondence to Ellen H. Fukuda.

Additional information


This work was supported by the Kyoto University Foundation and by the Grant-in-Aid for Scientific Research (C) (17K00032 and 19K11840) from the Japan Society for the Promotion of Science.

Appendix A

Here, we list the test problems used in Sect. 7. For each problem, we state the original reference, the number of variables n, the number of objective functions m, the convexity property, the objective functions, and the bounds L and U of the box constraints.

  1.

    Das and Dennis (DD1) [1]: \(n=5\), \(m=2\), nonconvex,\(^{1,2}\)

    $$\begin{aligned} F_{1}(x)&=x_{1}^{2}+x_{2}^{2}+x_{3}^{2}+x_{4}^{2}+x_{5}^{2},\\ F_{2}(x)&=3x_{1}+2x_{2}-\frac{x_{3}}{3}+0.01(x_{4}-x_{5})^{3}, \end{aligned}$$

    \(L=(-20, \ldots , -20)^{\top }\), and \(U=(20, \ldots , 20)^{\top }\).

  2.

    Fliege, Graña Drummond and Svaiter (FDS) [4]: \(n=10\), \(m=3\), convex,\(^2\)

    $$\begin{aligned} F_{1}(x)&=\frac{1}{n^{2}}\sum _{i=1}^{n}i(x_{i}-i)^{4},\\ F_{2}(x)&=\exp \left( \sum _{i=1}^{n}\frac{x_{i}}{n}\right) +\Vert x\Vert _{2}^{2},\\ F_{3}(x)&=\frac{1}{n(n+1)}\sum _{i=1}^{n}i(n-i+1)e^{-x_{i}}, \end{aligned}$$

    \(L=(-2, \ldots , -2)^{\top }\), and \(U=(2,\ldots , 2)^{\top }\).

  3.

    Jin, Olhofer and Sendhoff (JOS1) [8]: \(n=5\), \(m=2\), quadratic convex,\(^2\)

    $$\begin{aligned} F_{1}(x)&=\frac{1}{n}\sum _{i=1}^{n}x_{i}^{2},\\ F_{2}(x)&=\frac{1}{n}\sum _{i=1}^{n}(x_{i}-2)^{2}, \end{aligned}$$

    \(L=(-2,\ldots , -2)^{\top }\), and \(U=(2,\ldots , 2)^{\top }\).
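The problems above translate directly into code; as an example, a minimal sketch of the JOS1 objectives and their Jacobian (the function names are ours, for illustration only):

```python
import numpy as np

def jos1(x):
    """Objective values of JOS1 at a point x in R^n, with n = len(x)."""
    n = len(x)
    f1 = np.sum(x**2) / n
    f2 = np.sum((x - 2.0)**2) / n
    return np.array([f1, f2])

def jos1_jac(x):
    """Jacobian (2 x n): row j holds the gradient of F_{j+1}."""
    n = len(x)
    return np.vstack([2.0 * x / n, 2.0 * (x - 2.0) / n])
```

For instance, at \(x=(1,\ldots ,1)^{\top }\) with \(n=5\), both objectives equal 1 and the two gradients point in opposite directions, so the point is Pareto critical.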

  4.

    Kim and de Weck (KW2) [9]: \(n=2\), \(m=2\), nonconvex,

    $$\begin{aligned} F_{1}(x) =&-3(1-x_{1})^{2} \exp (-x_{1}^{2}-(x_{2}+1)^{2})\\&+ 10\left( \frac{x_{1}}{5}-x_{1}^{3}-x_{2}^{5}\right) \exp (-x_{1}^{2}-x_{2}^{2})\\&+3 \exp (-(x_{1}+2)^{2}-x_{2}^{2})-0.5(2x_{1}+x_{2}),\\ F_{2}(x) =&-3(1+x_{2})^{2} \exp (-x_{2}^{2}-(1-x_{1})^{2}) \\&+ 10\left( -\frac{x_{2}}{5}+x_{2}^{3}+x_{1}^{5}\right) \exp (-x_{1}^{2}-x_{2}^{2})\\&+ 3 \exp (-(2-x_{2})^{2}-x_{1}^{2}), \end{aligned}$$

    \(L=(-3,-3)^{\top }\), and \(U=(3,3)^{\top }\).

  5.

    Stadler and Dauer (SD) [18]: \(n=4\), \(m=2\), convex,

    $$\begin{aligned} F_{1}(x)&=2x_{1}+\sqrt{2}x_{2}+\sqrt{2}x_{3}+x_{4},\\ F_{2}(x)&=\frac{2}{x_{1}}+\frac{2\sqrt{2}}{x_{2}}+\frac{2\sqrt{2}}{x_{3}}+\frac{2}{x_{4}}, \end{aligned}$$

    \(L=\left( 1, \sqrt{2}, \sqrt{2}, 1\right) ^{\top }\), and \(U=(3, 3, 3, 3)^{\top }\).

  6.

    Zitzler, Deb and Thiele (ZDT1) [21]: \(n=30\), \(m=2\), convex,\(^2\)

    $$\begin{aligned} F_{1}(x)&=x_{1},\\ F_{2}(x)&=g(x)\left( 1-\sqrt{\frac{x_{1}}{g(x)}}\right) , \end{aligned}$$

    with \(g(x)=1+9\sum _{i=2}^{n} x_{i}/(n-1)\), \(L=(0,\ldots , 0)^{\top }\), and \(U=\left( \frac{1}{100},\ldots , \frac{1}{100}\right) ^{\top }\).

  7.

    Zitzler, Deb and Thiele (ZDT4) [21]: \(n=10\), \(m=2\), nonconvex,\(^2\)

    $$\begin{aligned} F_{1}(x)&=x_{1},\\ F_{2}(x)&=g(x)\left( 1-\sqrt{\frac{x_{1}}{g(x)}}\right) , \end{aligned}$$

    with \(g(x)=1+10(n-1)+\sum _{i=2}^{n}\left( x_{i}^{2}-10\cos (4\pi x_{i})\right) \), \(L=\left( \frac{1}{100}, -5,\ldots , -5\right) ^{\top }\), and \(U=(1, 5,\ldots , 5)^{\top }\).

  8.

    Toint (TOI4) [19, Problem 4]: \(n=4\), \(m=2\), convex,\(^{3}\)

    $$\begin{aligned} F_1(x)&=x_1^2+x_2^2+1,\\ F_2(x)&=0.5\left( (x_1-x_2)^2+(x_3-x_4)^2\right) +1, \end{aligned}$$

    \(L=(-2,\ldots , -2)^{\top }\), and \(U=(5,\ldots , 5)^{\top }\).

  9.

    TRIDIA [19, Problem 8]: \(n=3\), \(m=3\), convex,\(^{2,3}\)

    $$\begin{aligned} F_1(x)&=(2x_{1}-1)^{2}, \\ F_2(x)&=2(2x_{1}-x_{2})^{2}, \\ F_3(x)&=3(2x_{2}-x_{3})^{2}, \end{aligned}$$

    \(L=(-1, -1, -1)^{\top }\), and \(U=(1, 1, 1)^{\top }\).

  10.

    Shifted TRIDIA [19, Problem 9]: \(n=4\), \(m=4\), nonconvex,\(^{2,3}\)

    $$\begin{aligned} F_{1}(x)&=(2x_{1}-1)^{2}+x_{2}^{2},\\ F_{i}(x)&=i(2x_{i-1}-x_{i})^{2}-(i-1)x_{i-1}^{2}+ix_{i}^{2} \quad i=2,3,\\ F_{4}(x)&=4(2x_{3}-x_{4})^{2}-3x_{3}^{2}, \end{aligned}$$

    \(L=(-1,\ldots , -1)^{\top }\), and \(U=(1,\ldots , 1)^{\top }\).

  11.

    Rosenbrock [19, Problem 10]: \(n=4\), \(m=3\), nonconvex,\(^{2,3}\)

    $$\begin{aligned} F_{i}(x)&=100(x_{i+1}-x_{i}^{2})^{2}+(x_{i+1}-1)^{2}, \quad i=1,2,3, \end{aligned}$$

    \(L=(-2,\ldots , -2)^{\top }\), and \(U=(2,\ldots , 2)^{\top }\).

  12.

    Helical valley [14, Problem (7)]: \(n=3\), \(m=3\), nonconvex,\(^3\)

    $$\begin{aligned} F_{1}(x)&=\left\{ \begin{array}{ll} \left[ 10\left( x_3-\frac{5}{\pi }\arctan \left( \frac{x_2}{x_1}\right) \right) \right] ^2, &{} \text {if } x_1>0,\\ \left[ 10\left( x_3-\frac{5}{\pi }\arctan \left( \frac{x_2}{x_1}\right) -5\right) \right] ^2, &{} \text {if } x_1<0, \end{array}\right. \\ F_{2}(x)&=\left( 10\left( (x_1^2+x_2^2)^{1/2}-1\right) \right) ^2,\\ F_{3}(x)&=x_3^2, \end{aligned}$$

    \(L=(-2, -2, -2)^{\top }\), and \(U=(2, 2, 2)^{\top }\).

  13.

    Gaussian [14, Problem (9)]: \(n=3\), \(m=15\), nonconvex,\(^3\)

    $$\begin{aligned} F_{i}(x)&=x_1\exp \left( \frac{-x_2(t_i-x_3)^2}{2}\right) -y_i, \end{aligned}$$

    where \(t_i=(8-i)/2\), \(i=1,\ldots ,m\), and \(y_i\) is given in the following table:

    \(i\)     | 1, 15  | 2, 14  | 3, 13  | 4, 12  | 5, 11  | 6, 10  | 7, 9   | 8
    \(y_i\)   | 0.0009 | 0.0044 | 0.0175 | 0.0540 | 0.1295 | 0.2420 | 0.3521 | 0.3989

    \(L=(-2, -2, -2)^{\top }\), and \(U=(2, 2, 2)^{\top }\).
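The symmetric \(y_i\) values in the table lend themselves to a compact implementation; a minimal sketch of the 15 Gaussian objectives (function and variable names are ours, for illustration):

```python
import numpy as np

# y_i for i = 1..8 from the table above; the remaining values mirror them,
# since y_i = y_{16-i}.
Y_HALF = [0.0009, 0.0044, 0.0175, 0.0540, 0.1295, 0.2420, 0.3521, 0.3989]
Y = np.array(Y_HALF + Y_HALF[-2::-1])   # y_1, ..., y_15
T = (8.0 - np.arange(1, 16)) / 2.0      # t_i = (8 - i)/2

def gaussian_mo(x):
    """The m = 15 objective values F_i(x) = x1 exp(-x2 (t_i - x3)^2 / 2) - y_i."""
    x1, x2, x3 = x
    return x1 * np.exp(-x2 * (T - x3)**2 / 2.0) - Y
```

With \(x_3=0\) the exponents are symmetric in \(i\) about \(i=8\), so the residual vector inherits the symmetry of the \(y_i\).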

  14.

    Brown and Dennis [14, Problem (16)]: \(n=4\), \(m=5\), nonconvex,\(^3\)

    $$\begin{aligned} F_i(x)&=\left( x_1+t_i x_2-e^{t_i}\right) ^2+\left( x_3+x_4\sin (t_i)-\cos (t_i)\right) ^2, \end{aligned}$$

    where \(t_i=i/5\), \(L=(-25, -5, -5, -1)^{\top }\), and \(U=(25, 5, 5, 1)^{\top }\).

  15.

    Trigonometric [14, Problem (26)]: \(n=4\), \(m=4\), nonconvex,\(^{2,3}\)

    $$\begin{aligned} F_i(x)&=\left( n-\sum _{j=1}^{n}\cos x_j+i\left( 1-\cos x_i\right) -\sin x_i\right) ^2, \quad i=1,\ldots ,4, \end{aligned}$$

    \(L=(-1,\ldots , -1)^{\top }\), and \(U=(1,\ldots , 1)^{\top }\).

  16.

    Linear function – rank 1 [14, Problem (33)]: \(n=10\), \(m=4\), convex,\(^{2,3}\)

    $$\begin{aligned} F_i(x)&=\left( i\left( \sum _{j=1}^{n}jx_j\right) -1\right) ^2, \quad i=1,\ldots ,4, \end{aligned}$$

    \(L=(-1,\ldots , -1)^{\top }\), and \(U=(1,\ldots , 1)^{\top }\).


Cite this article

Mita, K., Fukuda, E.H. & Yamashita, N. Nonmonotone line searches for unconstrained multiobjective optimization problems. J Glob Optim 75, 63–90 (2019). https://doi.org/10.1007/s10898-019-00802-0

