
A New Class of Improved Convex Underestimators for Twice Continuously Differentiable Constrained NLPs

  • Published in: Journal of Global Optimization

Abstract

We present a new class of convex underestimators for arbitrarily nonconvex and twice continuously differentiable functions. The underestimators are derived by augmenting the original nonconvex function with a nonlinear relaxation function. The relaxation function is a separable convex function that involves a sum of univariate parametric exponential functions. An efficient procedure is developed for finding appropriate values of these parameters; it uses interval arithmetic extensively to verify whether the new underestimator is convex. For arbitrarily nonconvex functions it is shown that these convex underestimators are tighter than those generated by the αBB method. Computational studies, complemented with geometrical interpretations, demonstrate the potential benefits of the proposed improved convex underestimators.
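
To make the construction concrete, the following is a minimal univariate sketch in Python (an illustration only, not the authors' implementation). It assumes a per-variable relaxation term of the form exp(γ(x − xL)) + exp(γ(xU − x)) − 1 − exp(γ(xU − xL)), which is convex, nonpositive over the box and zero at both bounds, so adding it to f yields an underestimator that coincides with f at the bounds; for reference, the classical αBB relaxation instead adds the quadratic term α(xL − x)(xU − x). The test function, the box, and the parameter search below are hypothetical stand-ins; in particular, the crude dense-sampling check replaces the rigorous interval-arithmetic convexity verification described in the abstract.

# Rough, self-contained sketch (not the paper's code) of building an exponential
# convex underestimator for a nonconvex univariate function over a box.
import numpy as np

X_LO, X_HI = 0.0, 3.0          # box for the univariate example


def f(x):
    # A simple nonconvex, twice continuously differentiable test function.
    return np.sin(3.0 * x) + 0.1 * x ** 2


def f_second(x):
    # Exact second derivative of f, used by the sampled convexity check below.
    return -9.0 * np.sin(3.0 * x) + 0.2


def relaxation(x, gamma):
    # Assumed separable exponential relaxation term (illustrative form only):
    # convex, nonpositive on [X_LO, X_HI], and zero at both bounds.
    return (np.exp(gamma * (x - X_LO)) + np.exp(gamma * (X_HI - x))
            - 1.0 - np.exp(gamma * (X_HI - X_LO)))


def relaxation_second(x, gamma):
    # Second derivative of the relaxation term; it is strictly positive.
    return gamma ** 2 * (np.exp(gamma * (x - X_LO)) + np.exp(gamma * (X_HI - x)))


def underestimator(x, gamma):
    # Augmented function: original f plus the convex, nonpositive relaxation.
    return f(x) + relaxation(x, gamma)


def find_gamma(gamma0=0.1, step=0.1, samples=2001):
    # Stand-in for the paper's parameter search: increase gamma until the sampled
    # second derivative of the augmented function is nonnegative over the box.
    # (The paper instead verifies convexity rigorously with interval arithmetic.)
    xs = np.linspace(X_LO, X_HI, samples)
    gamma = gamma0
    while np.any(f_second(xs) + relaxation_second(xs, gamma) < 0.0):
        gamma += step
    return gamma


if __name__ == "__main__":
    gamma = find_gamma()
    xs = np.linspace(X_LO, X_HI, 401)
    assert np.all(underestimator(xs, gamma) <= f(xs) + 1e-9)   # never overestimates
    print("gamma found:", round(gamma, 2))
    print("max separation from f:", float(np.max(f(xs) - underestimator(xs, gamma))))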

Cite this article

Akrotirianakis, I.G., Floudas, C.A. A New Class of Improved Convex Underestimators for Twice Continuously Differentiable Constrained NLPs. J Glob Optim 30, 367–390 (2004). https://doi.org/10.1007/s10898-004-6455-4
