First order rejection tests for multiple-objective optimization

Published in: Journal of Global Optimization

Abstract

Three rejection tests for multi-objective optimization problems, based on first order optimality conditions, are proposed. These tests can certify that a box contains no local minimizer, so the box can be excluded from the search process. They generalize previously proposed rejection tests in several regards: their scope includes inequality and equality constrained, smooth or nonsmooth, multiple objective problems. Reported experiments show that they remove the cluster effect quite efficiently in mono-objective and multi-objective problems, which is one of the key issues in continuous deterministic global optimization.



Notes

  1. In the case of vector-valued functions \(f=(f_1, \ldots , f_m),\,\partial f\) is a matrix whose columns are \(\partial f_i\), so \(\partial f(x_*) \, \lambda =\sum \nolimits _i\lambda _i\partial f_i(x_*)\).
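
The identity above is simply the column interpretation of a matrix–vector product. As a quick numerical sanity check (toy values, not from the paper):

```python
import numpy as np

# Columns of J play the role of the (sub)gradients of f_1, ..., f_m
# at some point x_*; the values here are arbitrary toy data.
J = np.array([[1.0,  0.0],
              [2.0, -1.0]])
lam = np.array([0.5, 2.0])

# The matrix-vector product equals the lambda-weighted sum of the columns.
lhs = J @ lam
rhs = lam[0] * J[:, 0] + lam[1] * J[:, 1]
```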

  2. Clusters of small boxes appear around local or global minimizers due to excessive splitting and the failure to remove the resulting boxes, which lie too close to these minimizers to be rejected. This behavior is generic and is one of the main issues in deterministic global optimization.

  3. These inequality constraints are only potentially active because interval extensions are generally pessimistic. All of the proposed rejection tests remain correct when a potentially active constraint is actually inactive, although they are more efficient when inactive constraints are more accurately detected.

  4. Typically, an approximate generalized inverse of the midpoint of \(\mathbf{G}_*(\mathbf{x})\).
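
A common way to realize this (a minimal sketch with hypothetical toy values, not the authors' code) is to take the Moore–Penrose pseudoinverse of the midpoint matrix, which also covers rectangular or rank-deficient cases:

```python
import numpy as np

# Hypothetical interval matrix G_*(x), stored as lower/upper bound arrays.
G_lo = np.array([[0.9, -0.1], [0.2, 1.8]])
G_hi = np.array([[1.1,  0.1], [0.4, 2.2]])

# Midpoint matrix of the interval matrix.
G_mid = 0.5 * (G_lo + G_hi)

# Approximate generalized inverse of the midpoint, usable as a
# preconditioner C even when G_mid is not square or not full rank.
C = np.linalg.pinv(G_mid)
```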

  5. Note that \(\varSigma (\mathbf{A},Ce)\) is preconditioned, so the Gauss–Seidel iteration needs to divide only by the diagonal entries of \(\mathbf{A}\). On the other hand, \(\varSigma (\mathbf{G}_*,e)\) is not preconditioned, so all entries of \(\mathbf{G}_*\) need to be solved for. This can be performed efficiently using inner subtraction.
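
For the preconditioned case, one sweep of the interval Gauss–Seidel operator can be sketched as follows. This is a minimal illustration with hand-rolled interval helpers (tuples `(lo, hi)`); a rigorous implementation would use outward rounding, which is omitted here, and this is not the authors' code:

```python
def isub(x, y): return (x[0] - y[1], x[1] - y[0])
def imul(x, y):
    p = [x[0]*y[0], x[0]*y[1], x[1]*y[0], x[1]*y[1]]
    return (min(p), max(p))
def idiv(x, y):
    assert y[0] > 0 or y[1] < 0, "divisor interval must not contain zero"
    p = [x[0]/y[0], x[0]/y[1], x[1]/y[0], x[1]/y[1]]
    return (min(p), max(p))
def iintersect(x, y):
    lo, hi = max(x[0], y[0]), min(x[1], y[1])
    return (lo, hi) if lo <= hi else None  # empty: no solution in the box

def gauss_seidel_step(A, b, x):
    """One interval Gauss-Seidel sweep on A x = b, contracting the box x.

    In the preconditioned case only the diagonal entries of A are divided
    by, as in the note above. Returns None if some component becomes empty,
    which certifies that the box contains no solution.
    """
    n = len(x)
    x = list(x)
    for i in range(n):
        acc = b[i]
        for j in range(n):
            if j != i:
                acc = isub(acc, imul(A[i][j], x[j]))
        xi = iintersect(x[i], idiv(acc, A[i][i]))
        if xi is None:
            return None
        x[i] = xi
    return x
```

An emptied component proves the linear system has no solution in the box, which is exactly what a rejection test needs.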

  6. Although not noted in [31], the angle between the interval evaluations \(\mathbf{g}_1\) and \(\mathbf{g}_2\) of two gradients can be proved not to contain \(\pi \) simply by checking that the interval scalar product \(\mathbf{g}_1\,\mathbf{g}_2\) does not intersect \(-||\mathbf{g}_1||\,||\mathbf{g}_2||\). This is sufficient for rejecting the box, and is easily computed in arbitrary dimensions.
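
This check (the angle cannot be \(\pi\) when the interval dot product provably avoids minus the product of the norms, since anti-parallel gradients would force \(\mathbf{g}_1\cdot\mathbf{g}_2 = -||\mathbf{g}_1||\,||\mathbf{g}_2||\)) can be sketched with naive interval arithmetic. This is an illustrative sketch without outward rounding, not the implementation of [31]:

```python
import math

# Intervals are (lo, hi) tuples; interval vectors are lists of intervals.
def iadd(x, y): return (x[0] + y[0], x[1] + y[1])
def imul(x, y):
    p = [x[0]*y[0], x[0]*y[1], x[1]*y[0], x[1]*y[1]]
    return (min(p), max(p))
def isqr(x):
    lo, hi = x
    if lo <= 0.0 <= hi:
        return (0.0, max(lo*lo, hi*hi))
    return (min(lo*lo, hi*hi), max(lo*lo, hi*hi))
def isqrt(x): return (math.sqrt(max(x[0], 0.0)), math.sqrt(x[1]))

def idot(g1, g2):
    acc = (0.0, 0.0)
    for a, b in zip(g1, g2):
        acc = iadd(acc, imul(a, b))
    return acc

def inorm(g):
    acc = (0.0, 0.0)
    for a in g:
        acc = iadd(acc, isqr(a))
    return isqrt(acc)

def angle_excludes_pi(g1, g2):
    """True if the interval scalar product provably avoids -||g1||*||g2||,
    so the angle between the gradients cannot be pi (box rejectable)."""
    d = idot(g1, g2)
    n = imul(inorm(g1), inorm(g2))
    neg = (-n[1], -n[0])          # interval enclosure of -||g1||*||g2||
    return d[1] < neg[0] or d[0] > neg[1]
```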

  7. The asymptotic analyses of the cluster effect provided in [6], in the context of unconstrained optimization, or in [34], in the context of a system of equations, lead to different models that do not hold here. In particular, both [6] and [34] assume pessimistic interval evaluations, while this academic problem suffers from the cluster effect in spite of exact interval evaluations of its objective function and constraints.

  8. GloptLab here implements a branch and prune algorithm based on constraint propagation on DAGs [35, 40].

    Fig. 2: The solution sets for \(n=2\) of the constraints from Sects. 5.1 and 5.1 respectively (in the left graphic, the dashed rectangle represents the cylinder where a cluster effect is expected)

  9. Experiments not reported here have shown that constraint propagation can remove the cluster effect when \(n=2\), although the propagation converges more and more slowly as the optimum is approached, making it prohibitively expensive near the optimum. This is generic in two variables; in higher dimensions, however, constraint propagation is no longer able to remove the cluster effect.

References

  1. Alefeld, G., Herzberger, J.: Introduction to interval computations. Comput. Sci. Appl. Math. (1974)

  2. Barichard, V., Deleau, H., Hao, J.K., Saubion, F.: A hybrid evolutionary algorithm for CSP. In: Artificial Evolution, volume 2936 of LNCS, pp. 79–90. Springer, Berlin (2004)

  3. Barichard, V., Hao, J.K.: A population and interval constraint propagation algorithm. In: Evolutionary Multi-Criterion Optimization, volume 2632 of LNCS, pp. 72–72. Springer, Berlin (2003)

  4. Clarke, F.H.: Optimization and Nonsmooth Analysis. Society for Industrial and Applied Mathematics, Philadelphia (1990)

  5. Domes, F.: GloptLab: a configurable framework for the rigorous global solution of quadratic constraint satisfaction problems. Optim. Methods Softw. 24(4–5), 727–747 (2009)

  6. Du, K., Kearfott, R.B.: The cluster problem in multivariate global optimization. J. Glob. Optim. 5, 253–265 (1994)

  7. Duran, M.A., Grossmann, I.E.: An outer-approximation algorithm for a class of mixed-integer nonlinear programs. Math. Program. 36, 307–339 (1986)

  8. Garloff, J.: Interval Gaussian elimination with pivot tightening. SIAM J. Matrix Anal. Appl. 30(4), 1761–1772 (2009)

  9. Goldberg, D.: What every computer scientist should know about floating-point arithmetic. Comput. Surv. 23(1), 5–48 (1991)

  10. Goualard, F.: GAOL 3.1.1: Not Just Another Interval Arithmetic Library, 4th edn. Laboratoire d'Informatique de Nantes-Atlantique (2006)

  11. Granvilliers, L., Goldsztejn, A.: A branch-and-bound algorithm for unconstrained global optimization. In: Proceedings of the 14th GAMM-IMACS International Symposium on Scientific Computing, Computer Arithmetic and Validated Numerics (SCAN) (2010)

  12. Hansen, E.: Global Optimization Using Interval Analysis, 2nd edn. Marcel Dekker, NY (1992)

  13. Hansen, E.R., Walster, G.W.: Bounds for Lagrange multipliers and optimal points. Comput. Math. Appl. 25(10–11), 59–69 (1993)

  14. Ichida, K., Fujii, Y.: Multicriterion optimization using interval analysis. Computing 44, 47–57 (1990)

  15. Jahn, J.: Multiobjective search algorithm with subdivision technique. Comput. Optim. Appl. 35(2), 161–175 (2006)

  16. Jaulin, L., Kieffer, M., Didrit, O., Walter, E.: Applied Interval Analysis with Examples in Parameter and State Estimation, Robust Control and Robotics. Springer, Berlin (2001)

  17. Kearfott, R.B.: Interval computations: introduction, uses, and resources. Euromath Bull. 2(1), 95–112 (1996)

  18. Kearfott, R.B.: An interval branch and bound algorithm for bound constrained optimization problems. J. Glob. Optim. 2, 259–280 (1992)

  19. Kearfott, R.B.: Interval computations, rigour and non-rigour in deterministic continuous global optimization. Optim. Methods Softw. 26(2), 259–279 (2011)

  20. Knueppel, O.: PROFIL/BIAS—a fast interval library. Computing 53(3–4), 277–287 (1994)

  21. Kubica, B.J., Niewiadomska-Szynkiewicz, E.: An improved interval global optimization method and its application to price management problem. In: Applied Parallel Computing. State of the Art in Scientific Computing, volume 4699 of LNCS, pp. 1055–1064. Springer, Berlin (2007)

  22. Kubica, B.J., Wozniak, A.: Interval methods for computing the Pareto front of a multicriterial problem. In: Parallel Processing and Applied Mathematics, volume 4967 of LNCS, pp. 1382–1391. Springer, Berlin (2008)

  23. Kubica, B.J., Wozniak, A.: Using the second-order information in Pareto-set computations of a multi-criteria problem. In: Applied Parallel and Scientific Computing, volume 7134 of LNCS, pp. 137–147. Springer, Berlin (2012)

  24. Lin, Y., Stadtherr, M.A.: LP strategy for interval-Newton method in deterministic global optimization. In: Floudas, C.A., Pardalos, P.M. (eds.) Encyclopedia of Optimization, pp. 1937–1943. Springer, USA (2009)

  25. Makino, K., Berz, M.: New applications of Taylor model methods. In: Corliss, G., Faure, C., Griewank, A., Hascoët, L., Naumann, U. (eds.) Automatic Differentiation of Algorithms. Springer, New York (2002)

  26. Miettinen, K.M.: Nonlinear Multiobjective Optimization. Kluwer, Dordrecht (1998)

  27. Moore, R.: Interval Analysis. Prentice-Hall, Englewood Cliffs, NJ (1966)

  28. Neumaier, A.: Interval Methods for Systems of Equations. Cambridge Univ Press, Cambridge (1990)

  29. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, Berlin (2006)

  30. Rohn, J.: Forty necessary and sufficient conditions for regularity of interval matrices: a survey. Electron. J. Linear Algebra 18, 500–512 (2009)

  31. Ruetsch, G.R.: An interval algorithm for multi-objective optimization. Struct. Multidiscip. Optim. 30, 27–37 (2005)

  32. Ruetsch, G.R.: Using interval techniques of direct comparison and differential formulation to solve a multi-objective optimization problem (patent US-7742902) (2010)

  33. Rump, S.M.: INTLAB—Interval Laboratory. In: Csendes, T. (ed.) Developments in Reliable Computing. Kluwer, Dordrecht (1999)

  34. Schichl, H., Neumaier, A.: Exclusion regions for systems of equations. SIAM J. Numer. Anal. 42(1), 383–408 (2004)

  35. Schichl, H., Neumaier, A.: Interval analysis on directed acyclic graphs for global optimization. J. Glob. Optim. 33(4), 541–562 (2005)

  36. Schichl, H., Neumaier, A.: Transposition theorems and qualification-free optimality conditions. SIAM J. Optim. 17(4), 1035–1055 (2006)

  37. Shcherbina, O., Neumaier, A., Sam-Haroud, D., Vu, X.-H., Nguyen, T.-V.: Benchmarking global optimization and constraint satisfaction codes. In: Bliek, C., Jermann, C., Neumaier, A. (eds.) Global Optimization and Constraint Satisfaction, volume 2861 of LNCS, pp. 211–222. Springer, Berlin (2003)

  38. Soares, G.L., Parreiras, R.O., Jaulin, L., Vasconcelos, J.A., Maia, C.A.: Interval robust multi-objective algorithm. Nonlinear Anal. Theory Methods Appl. 71(12), 1818–1825 (2009)

  39. Tóth, B.G., Hernández, J.F.: Interval Methods for Single and Bi-objective Optimization Problems Applied to Competitive Facility Location Models. LAP Lambert Academic Publishing (2010)

  40. Vu, X.-H., Schichl, H., Sam-Haroud, D.: Interval propagation and search on directed acyclic graphs for numerical constraint solving. J. Glob. Optim. 45(4), 499–531 (2009)

  41. Wolfram Research, Inc.: Mathematica, Version 7.0. Champaign, IL (2008)


Acknowledgments

This work was partially funded by the Région Pays de la Loire of France and the National Institute of Informatics of Japan. The authors are very grateful to Christophe Jermann for his valuable help in experimenting with and analyzing the cluster effect of the presented academic examples.

Author information


Corresponding author

Correspondence to Alexandre Goldsztejn.


Cite this article

Goldsztejn, A., Domes, F. & Chevalier, B. First order rejection tests for multiple-objective optimization. J Glob Optim 58, 653–672 (2014). https://doi.org/10.1007/s10898-013-0066-x
