
Newton-like methods for efficient solutions in vector optimization


Abstract

In this work we study Newton-like methods for finding efficient solutions of the vector optimization problem for a map from a finite-dimensional Hilbert space X to a Banach space Y, with respect to the partial order induced by a closed, convex, and pointed cone C with nonempty interior. We present both an exact version and an inexact version, in which the subproblems are solved only approximately, within a prescribed tolerance. Furthermore, we prove that, under reasonable hypotheses, the sequence generated by our method converges to an efficient solution of this problem.
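To make the idea concrete, the sketch below is a minimal numerical illustration, not the method analysed in the paper: it specializes to the finite-dimensional multiobjective case with the Pareto cone C = R^m_+, where a Newton-type direction at a point x is obtained by minimizing over directions d the worst-case quadratic model max_j { ∇f_j(x)^T d + (1/2) d^T ∇²f_j(x) d }. The helper name newton_direction, the epigraph reformulation, and the use of SciPy's SLSQP solver are illustrative choices and are not taken from the paper.

```python
# A minimal sketch (not the paper's general Banach-space/cone algorithm):
# the classical Newton direction subproblem for multiobjective optimization
# with the Pareto cone C = R^m_+, solved via its epigraph reformulation.
import numpy as np
from scipy.optimize import minimize

def newton_direction(grads, hessians):
    """Solve  min_{t,d} t  s.t.  g_j^T d + 0.5 d^T H_j d <= t  for all j.

    grads:    list of gradients g_j = grad f_j(x), each of shape (n,)
    hessians: list of Hessians  H_j = hess f_j(x), each of shape (n, n)
    Returns the direction d and the optimal value theta(x); for positive
    definite Hessians theta(x) <= 0, with equality exactly at critical points.
    """
    n = grads[0].size
    z0 = np.zeros(n + 1)  # decision variable z = (t, d)

    def obj(z):
        return z[0]

    # SLSQP "ineq" constraints require fun(z) >= 0, i.e. t - q_j(d) >= 0.
    cons = [{"type": "ineq",
             "fun": lambda z, g=g, H=H: z[0] - g @ z[1:] - 0.5 * z[1:] @ H @ z[1:]}
            for g, H in zip(grads, hessians)]

    res = minimize(obj, z0, method="SLSQP", constraints=cons)
    return res.x[1:], res.x[0]

# Toy bi-objective example: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x = np.array([2.0, 2.0])
grads = [2 * (x - a), 2 * (x - b)]
hessians = [2 * np.eye(2), 2 * np.eye(2)]
d, theta = newton_direction(grads, hessians)
print("Newton direction:", d, " subproblem value theta(x):", theta)
# An exact Newton-like iteration would set x <- x + d (possibly with a line
# search); an inexact variant accepts any direction that solves the
# subproblem within a prescribed tolerance.
```

For this toy instance the computed direction is approximately (-1.5, -1.5), moving the iterate onto the segment between a and b, which is the Pareto set of the example.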



Acknowledgements

The author would like to thank the referees for valuable comments and suggestions.

Author information

Correspondence to Thai Doan Chuong.

Additional information

This work was supported by a research grant from the National Foundation for Science and Technology Development of Vietnam (NAFOSTED).


About this article

Cite this article

Chuong, T.D. Newton-like methods for efficient solutions in vector optimization. Comput Optim Appl 54, 495–516 (2013). https://doi.org/10.1007/s10589-012-9495-6

