Constraint Qualifications for Karush–Kuhn–Tucker Conditions in Multiobjective Optimization

Abstract

The notion of the normal cone to a given set is paramount in optimization and variational analysis. In this work, we introduce a multiobjective normal cone, which is suitable for studying optimality conditions and constraint qualifications for multiobjective optimization problems. A detailed study of the properties of the multiobjective normal cone is conducted. With this tool, we characterize weak and strong Karush–Kuhn–Tucker conditions by means of a Guignard-type constraint qualification. Furthermore, we show how to compute the multiobjective normal cone under the error bound property. The main statements are illustrated by examples.
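
For orientation, the display below records the smooth multiobjective problem and the weak and strong Karush–Kuhn–Tucker systems in their standard textbook form (see, e.g., [4, 19]). This is only a sketch with generic notation ($F$, $f_i$, $g_j$, $\lambda$, $\mu$); the precise formulations adopted in the paper may differ.

\[
\text{(MOP)}\qquad \min_{x\in\mathbb{R}^n}\; F(x)=\bigl(f_1(x),\dots,f_m(x)\bigr)
\quad\text{subject to}\quad g_j(x)\le 0,\ j=1,\dots,p.
\]
A feasible point $\bar{x}$ satisfies the weak KKT conditions if there exist $\lambda\in\mathbb{R}^m_{+}\setminus\{0\}$ and $\mu\in\mathbb{R}^p_{+}$ such that
\[
\sum_{i=1}^{m}\lambda_i\nabla f_i(\bar{x})+\sum_{j=1}^{p}\mu_j\nabla g_j(\bar{x})=0,
\qquad \mu_j\, g_j(\bar{x})=0,\quad j=1,\dots,p,
\]
and the strong KKT conditions if, in addition, $\lambda_i>0$ for every $i=1,\dots,m$. In the scalar case ($m=1$), the Guignard constraint qualification is, in a precise sense, the weakest condition ensuring that every local minimizer is a KKT point; the paper develops an analogous characterization for the two systems above through the multiobjective normal cone.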

Notes

  1. We sometimes use the term strong Pareto to refer to a Pareto point only to emphasize the contrast with the notion of a weak Pareto point. This is not related to the notion of strong minimality defined in [25].
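
     For reference, the standard definitions being contrasted here are as follows (textbook form, see, e.g., [4, 5]; the paper's own statements may differ slightly). With $F=(f_1,\dots,f_m)$ and inequalities read componentwise, a feasible point $\bar{x}$ is

     \[
     \begin{aligned}
     &\text{Pareto (strong Pareto) if there is no feasible } x \text{ with } F(x)\le F(\bar{x}) \text{ and } F(x)\ne F(\bar{x}),\\
     &\text{weak Pareto if there is no feasible } x \text{ with } f_i(x)<f_i(\bar{x}) \text{ for all } i=1,\dots,m.
     \end{aligned}
     \]

     Every Pareto point is a weak Pareto point, but not conversely.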

References

  1. Deb, K., Datta, R.: Hybrid evolutionary multiobjective optimization and analysis of machining operations. Eng. Optim. 44(6), 685–706 (2012)

  2. Deb, K.: Multi-objective Optimization Using Evolutionary Algorithms. Wiley, New York (2001)

  3. Jahn, J.: Vector Optimization: Theory, Applications, and Extensions. Springer, Berlin (2011)

  4. Miettinen, K.: Nonlinear Multiobjective Optimization. Springer, Berlin (1999)

  5. Ehrgott, M.: Multicriteria Optimization. Springer, Berlin (2005)

  6. Göpfert, A., Tammer, C., Riahi, H., Zălinescu, C.: Variational Methods in Partially Ordered Spaces. CMS Books in Mathematics. Springer, Berlin (2003)

  7. Fliege, J., Drummond, L.M.G., Svaiter, B.F.: Newton’s method for multiobjective optimization. SIAM J. Optim. 20, 602–626 (2009)

  8. Fliege, J., Vaz, A.I.F.: A method for constrained multiobjective optimization based on SQP techniques. SIAM J. Optim. 26(4), 2091–2119 (2016)

  9. Carrizo, G.A., Lotito, P.A., Maciel, M.C.: Trust region globalization strategy for the nonconvex unconstrained multiobjective optimization problem. Math. Program. 159, 339–369 (2016)

  10. Qu, S., Goh, M., Liang, B.: Trust region methods for solving multiobjective optimization. Optim. Methods Softw. 28(4), 796–811 (2013)

  11. Gass, S., Saaty, T.: The computational algorithm for the parametric objective function. Nav. Res. Logist. Q. 2, 39–45 (1955)

  12. Fliege, J., Vaz, A.I.F., Vicente, L.N.: A new scalarization and numerical method for constructing weak Pareto front of multi-objective optimization problems. Optimization 60(8), 1091–1104 (2011)

  13. Chankong, V., Haimes, Y.Y.: Multiobjective Decision Making: Theory and Methodology. North-Holland, Amsterdam (1983)

  14. Kesarwani, P., Dutta, J.: Charnes-Cooper scalarization and convex vector optimization. Optim. Lett. (2019). https://doi.org/10.1007/s11590-019-01502-0

  15. Burachik, R.S., Kaya, C.Y., Rizvi, M.M.: A new scalarization technique and new algorithms to generate Pareto fronts. SIAM J. Optim. 27(2), 1010–1034 (2017)

  16. Gerstewitz, C.: Nichtkonvexe Dualität in der Vektoroptimierung [Nonconvex duality in vector optimization]. Wiss. Zeitschr. Tech. Hochsch. Leuna-Merseburg 25, 357–364 (1983)

  17. Pascoletti, A., Serafini, P.: Scalarizing vector optimization problems. J. Optim. Theory Appl. 42, 499–524 (1984)

  18. Maciel, M.C., Santos, S.A., Sottosanto, G.N.: Regularity conditions in differentiable vector optimization revisited. J. Optim. Theory Appl. 142, 385–398 (2009)

  19. Maeda, T.: Constraint qualifications in multiobjective optimization problems: differentiable case. J. Optim. Theory Appl. 80, 483–500 (1994)

  20. Chandra, S., Dutta, J., Lalitha, C.S.: Regularity conditions and optimality in vector optimization. Numer. Funct. Anal. Optim. 25, 479–501 (2004)

  21. Bigi, G., Pappalardo, M.: Regularity conditions in vector optimization. J. Optim. Theory Appl. 102, 83–96 (1999)

  22. Rockafellar, R.T., Wets, R.: Variational Analysis. Series: Grundlehren der mathematischen Wissenschaften, vol. 317. Springer, Berlin (2009)

  23. Mordukhovich, B.S.: Variational Analysis and Generalized Differentiation I: Basic Theory. Series: Grundlehren der mathematischen Wissenschaften, vol. 330. Springer, Berlin (2005)

  24. Borwein, J.M., Zhu, Q.J.: Techniques of Variational Analysis. CMS Books in Mathematics. Springer, New York (2005)

  25. Bot, R.I., Grad, S.M., Wanka, G.: Duality in Vector Optimization. Springer, Berlin (2009)

  26. Petschke, M.: On a theorem of Arrow, Barankin, and Blackwell. SIAM J. Control Optim. 28(2), 395–401 (1990)

  27. Bigi, G.: Optimality and Lagrangian Regularity in Vector Optimization. Ph.D. Thesis, University of Pisa, Pisa (1999)

  28. Mangasarian, O.L.: Nonlinear Programming. SIAM, Philadelphia (1994)

  29. Andreani, R., Haeser, G., Schuverdt, M.L., Silva, P.J.S.: Two new weak constraint qualifications and applications. SIAM J. Optim. 22, 1109–1135 (2012)

  30. Andreani, R., Martínez, J., Ramos, A., Silva, P.: A cone-continuity constraint qualification and algorithmic consequences. SIAM J. Optim. 26(1), 96–110 (2016)

  31. Andreani, R., Martínez, J., Ramos, A., Silva, P.: Strict constraint qualifications and sequential optimality conditions for constrained optimization. Math. Oper. Res. 43(3), 693–717 (2018)

  32. Corley, H.W.: On optimality conditions for maximizations with respect to cones. J. Optim. Theory Appl. 46(1), 67–78 (1985)

  33. Ben-Israel, A.: Motzkin’s transposition theorem, and the related theorems of Farkas, Gordan and Stiemke. In: Encyclopedia of Mathematics, Supplement III. Kluwer Academic Publishers, Dordrecht (2001)

  34. Singh, C.: Optimality conditions in multiobjective differentiable programming. J. Optim. Theory Appl. 53(1), 115–123 (1987)

  35. Aghezzaf, B., Hachimi, M.: On a gap between multiobjective optimization and scalar optimization. J. Optim. Theory Appl. 109, 431–435 (2001)

  36. Castellani, M., Pappalardo, M.: About a gap between multiobjective optimization and scalar optimization. J. Optim. Theory Appl. 109, 437–439 (2001)

  37. Wang, S.Y., Yang, F.M.: A gap between multiobjective optimization and scalar optimization. J. Optim. Theory Appl. 68, 389–391 (1991)

  38. Giorgi, G.: A note on the Guignard constraint qualification and the Guignard regularity condition in vector optimization. Appl. Math. 4(4), 734–740 (2012)

  39. Andreani, R., Haeser, G., Ramos, A., Silva, P.: A second-order sequential optimality condition associated to the convergence of optimization algorithms. IMA J. Numer. Anal. 47, 53–63 (2017)

  40. Geoffrion, A.M.: Proper efficiency and the theory of vector maximization. J. Math. Anal. Appl. 22, 618–630 (1968)

  41. Pang, J.S.: Error bounds in mathematical programming. Math. Program. 79, 299–332 (1997)

  42. Bolte, J., Nguyen, T.P., Peypouquet, J., Suter, B.W.: From error bounds to the complexity of first-order descent methods for convex functions. Math. Program. 165(2), 471–507 (2017)

  43. Chen, X.: Smoothing methods for nonsmooth, nonconvex minimization. Math. Program. 134, 71–99 (2012)

  44. Nesterov, Y.: Smoothing technique and its applications in semidefinite optimization. Math. Program. 110, 245–259 (2007)

  45. Beck, A., Teboulle, M.: Smoothing and first order methods: a unified framework. SIAM J. Optim. 22(2), 557–580 (2012)

  46. Gfrerer, H., Ye, J.J.: New constraint qualifications for mathematical programs with equilibrium constraints via variational analysis. SIAM J. Optim. 27, 842–865 (2017)

  47. Giorgi, G., Jiménez, B., Novo, V.: Approximate Karush–Kuhn–Tucker condition in multiobjective optimization. J. Optim. Theory Appl. 171, 70–89 (2016)

  48. Zhang, P., Zhang, J., Lin, G.H., Ying, X.: Constraint qualifications and proper Pareto optimality conditions for multiobjective problems with equilibrium constraints. J. Optim. Theory Appl. 176, 763–782 (2018)

  49. Andreani, R., Haeser, G., Secchin, L.D., Silva, P.J.S.: New sequential optimality conditions for mathematical programs with complementarity constraints and algorithmic consequences. SIAM J. Optim. 29(4), 3201–3230 (2019)

  50. Ramos, A.: Mathematical programs with equilibrium constraints: a sequential optimality condition, new constraint qualifications and algorithmic consequences. Optim. Methods Softw. (2019). https://doi.org/10.1080/10556788.2019.1702661

  51. Ramos, A.: Two new weak constraint qualifications for mathematical programs with equilibrium constraints and applications. J. Optim. Theory Appl. 183(2), 566–591 (2019)

Author information

Corresponding author

Correspondence to Alberto Ramos.

Additional information

Communicated by Alexey F. Izmailov.

Cite this article

Haeser, G., Ramos, A. Constraint Qualifications for Karush–Kuhn–Tucker Conditions in Multiobjective Optimization. J Optim Theory Appl 187, 469–487 (2020). https://doi.org/10.1007/s10957-020-01749-z

Download citation

  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s10957-020-01749-z

Keywords

Mathematics Subject Classification

Navigation