
On rigorous upper bounds to a global optimum

Journal of Global Optimization

Abstract

In branch and bound algorithms for constrained global optimization, a sharp upper bound on the global optimum is important for the overall efficiency of the branch and bound process. Software for finding local optimizers, using floating point arithmetic, often computes an approximately feasible point close to an actual global optimizer. Algorithms that are not mathematically rigorous can simply evaluate the objective at such points to obtain approximate upper bounds. However, such points may actually be slightly infeasible, and the corresponding objective values may be slightly smaller than the global optimum. A consequence is that actual optimizers are occasionally missed, and the algorithm returns an approximate optimum and a corresponding approximate optimizer that may be far away from any actual global optimizer. In mathematically rigorous algorithms, objective values are accepted as upper bounds only if the point of evaluation is proven to be feasible. Such computational proofs of feasibility have been weak points in mathematically rigorous algorithms. This paper first reviews previously proposed automatic proofs of feasibility and then proposes an alternative technique. The alternative technique is tried on a test set that caused trouble for previous techniques, and is also employed in a mathematically rigorous branch and bound algorithm on that test set.
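
To make the distinction concrete, the following is a minimal sketch (in Python, and not the technique proposed in this paper) of how a rigorous upper bound differs from a merely approximate one. It assumes a hypothetical two-variable problem with a single inequality constraint; the names Interval, f_box, g_box, and rigorous_upper_bound are illustrative only. The candidate point returned by a local solver is inflated to a small box, the constraint is evaluated with outward-rounded interval arithmetic, and the interval upper bound of the objective is accepted only if the entire box is proven feasible. Equality constraints, which require a computational existence proof (for example, an interval Newton step on the active constraints), are beyond this sketch.

```python
# Minimal illustrative sketch (NOT the technique proposed in the paper):
# a rigorous upper bound from a candidate point for a hypothetical problem
#
#   minimize   f(x) = (x1 - 1)^2 + (x2 - 2)^2
#   subject to g(x) = x1^2 + x2^2 - 4 <= 0
#
# The candidate is inflated to a small box; the constraint is evaluated in
# outward-rounded interval arithmetic; the interval upper bound of f is
# accepted only if the whole box is proven feasible.

import math
from dataclasses import dataclass


def _down(x):  # round an endpoint outward, toward -infinity (one ulp)
    return math.nextafter(x, -math.inf)


def _up(x):    # round an endpoint outward, toward +infinity (one ulp)
    return math.nextafter(x, math.inf)


@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        return Interval(_down(self.lo + other.lo), _up(self.hi + other.hi))

    def __sub__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        return Interval(_down(self.lo - other.hi), _up(self.hi - other.lo))

    def sqr(self):
        m = max(abs(self.lo), abs(self.hi))
        lo = 0.0 if self.lo <= 0.0 <= self.hi else min(self.lo * self.lo,
                                                       self.hi * self.hi)
        return Interval(_down(lo), _up(m * m))


def f_box(x1, x2):              # interval enclosure of the objective
    return (x1 - 1.0).sqr() + (x2 - 2.0).sqr()


def g_box(x1, x2):              # interval enclosure of the constraint
    return x1.sqr() + x2.sqr() - 4.0


def rigorous_upper_bound(candidate, radius=1e-8):
    """Return a rigorous upper bound on the global minimum, or None if
    feasibility of the inflated box cannot be verified."""
    box = [Interval(c - radius, c + radius) for c in candidate]
    if g_box(*box).hi <= 0.0:   # every point of the box satisfies g <= 0
        return f_box(*box).hi   # hence this is a mathematically valid bound
    return None                 # feasibility not proven; reject the value


# A local solver may return a point on (or just outside) the boundary:
print(rigorous_upper_bound((0.8944272, 1.7888544)))  # -> None (rejected)
# A candidate nudged into the interior is verified and yields a rigorous,
# if slightly weaker, bound (about 0.0562 vs. the true minimum of about 0.0557):
print(rigorous_upper_bound((0.894, 1.788)))
```

As the usage lines suggest, a candidate that sits on, or slightly outside, the constraint boundary cannot be verified this way; this is precisely the weakness discussed above, and perturbing the candidate into the interior, or proving existence of a nearby feasible point, is then required before its objective value can be used as a rigorous upper bound.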


Notes

  1. Assuming, of course, there are no programming errors.


Acknowledgments

I wish to thank the referees for their numerous comments that substantially improved the content of this work.

Author information

Correspondence to Ralph Baker Kearfott.

Cite this article

Kearfott, R.B. On rigorous upper bounds to a global optimum. J Glob Optim 59, 459–476 (2014). https://doi.org/10.1007/s10898-014-0173-3

