
A New Algorithm for Box-Constrained Global Optimization

Published in: Journal of Optimization Theory and Applications

Abstract

An important class of deterministic methods for global optimization is based on the theory of terminal attractors and repellers. Unfortunately, the use of scalar repellers becomes unsuitable when the dimension n of the problem reaches values of operational interest. In previous papers the author et al. showed that BFGS-type methods, which approximate the Hessian of twice continuously differentiable functions with a structured matrix, are very efficient at computing local minima, particularly in the secant case. On the other hand, algorithms based on the classical αBB technique are often ineffective for computational reasons. In order to increase the power of the repellers in the tunneling phases, the use of repeller matrices with a proper structure is certainly promising and deserves investigation. In this work, it is shown that a BFGS-type method of low complexity, implemented in the local optimizations, can be effectively matched with proper repeller matrices in the tunneling phases. The novel algorithm FBαBB, which can be applied within the frame of the αBB computational scheme, is very efficient in terms of Number of Function Generations (NFG), Success Rates (SR) in the evaluation of the global minimum, and Number of Local Searches (NLS).
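The abstract combines two standard ingredients: the classical αBB convex underestimator over a box, and a tunneling transform in which a repeller (scalar, or a structured matrix as proposed here) pushes the search away from an already-found local minimizer. The following is a minimal sketch of both ideas; the function names, the parameter `lam`, and the matrix `R` are illustrative assumptions, not the paper's actual FBαBB implementation.

```python
import numpy as np


def alpha_bb_underestimator(f, alpha, lower, upper):
    # Classical alphaBB underestimator on the box [lower, upper]:
    #   L(x) = f(x) + alpha * sum_i (l_i - x_i) * (u_i - x_i).
    # Each product (l_i - x_i)(u_i - x_i) is <= 0 inside the box, so
    # L(x) <= f(x) there, with equality at the box vertices; for alpha
    # chosen large enough relative to the most negative eigenvalue of
    # the Hessian of f on the box, L is convex.
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)

    def L(x):
        x = np.asarray(x, dtype=float)
        return f(x) + alpha * np.dot(lower - x, upper - x)

    return L


def tunneling_value(f, x, x_star, R, lam=1.0):
    # Tunneling transform with a repeller placed at the local minimizer
    # x_star:
    #   T(x) = (f(x) - f(x_star)) / ((x - x_star)^T R (x - x_star))**lam
    # With R = I this reduces to the classical scalar repeller; a
    # structured positive-definite R (the repeller-matrix idea the
    # abstract refers to) makes the repulsion anisotropic.
    d = np.asarray(x, dtype=float) - np.asarray(x_star, dtype=float)
    denom = float(d @ R @ d) ** lam
    return (f(x) - f(x_star)) / denom
```

A tunneling phase would minimize `T` (rather than `f`) until a point with `T(x) <= 0`, i.e. `f(x) <= f(x_star)`, is found, after which a new local search on `f` starts from that point.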



Author information

Correspondence to S. Fanelli.

Additional information

Communicated by F. Zirilli.

The author wishes to thank the students Valeria Tozzi and Francesco Tudisco for their kind support in the implementation of Algorithm FBαBB. The author is also indebted to an anonymous referee for some useful suggestions.


Cite this article

Fanelli, S. A New Algorithm for Box-Constrained Global Optimization. J Optim Theory Appl 149, 175–196 (2011). https://doi.org/10.1007/s10957-010-9780-4
