
Decomposition in derivative-free optimization

Journal of Global Optimization

Abstract

This paper proposes a novel decomposition framework for derivative-free optimization (DFO) algorithms. Our framework significantly extends the scope of current DFO solvers to larger-scale problems. We show that the proposed framework closely relates to the superiorization methodology traditionally used to improve the efficiency of feasibility-seeking algorithms for constrained optimization problems in a derivative-based setting. We analyze the convergence behavior of the framework in the context of global search algorithms. A practical implementation is developed and exemplified with the global model-based solver Stable Noisy Optimization by Branch and Fit (SNOBFIT) [36]. To investigate the decomposition framework's performance, we conduct extensive computational studies on a collection of over 300 test problems of varying dimensions and complexity. We observe significant improvements in the quality of solutions for a large fraction of the test problems. Regardless of problem convexity and smoothness, decomposition leads to over 50% improvement in the objective function value after 2500 function evaluations for over 90% of our test problems with more than 75 variables.
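
The decomposition idea described above can be illustrated with a minimal sketch: partition the variables into blocks and let a derivative-free subsolver improve one block at a time while the remaining coordinates stay fixed. This is not the paper's SNOBFIT-based implementation; the function `block_decomposition_search`, its parameters, and the plain random-search subsolver are illustrative assumptions standing in for a real DFO subsolver.

```python
import random

def block_decomposition_search(f, x0, blocks, evals_per_block=50,
                               cycles=5, step=0.5, seed=0):
    """Cyclic block decomposition with a random-search subsolver.

    f      : objective, callable on a list of floats
    x0     : starting point
    blocks : partition of variable indices, e.g. [[0, 1], [2, 3]]
    """
    rng = random.Random(seed)
    x, fx = list(x0), f(list(x0))
    for _ in range(cycles):
        for block in blocks:
            # Subproblem: perturb only the variables in this block,
            # holding all other coordinates fixed.
            for _ in range(evals_per_block):
                cand = list(x)
                for i in block:
                    cand[i] += rng.gauss(0.0, step)
                fc = f(cand)
                if fc < fx:          # keep only improving points
                    x, fx = cand, fc
        step *= 0.7                  # shrink the search radius each cycle
    return x, fx

# Usage on a 4-variable sphere function, split into two 2-variable blocks.
sphere = lambda v: sum(t * t for t in v)
xbest, fbest = block_decomposition_search(sphere, [3.0, 3.0, 3.0, 3.0],
                                          [[0, 1], [2, 3]])
```

Because each subproblem only varies a block of coordinates, the subsolver always works in a low-dimensional space, which is the property that lets decomposition scale DFO methods to problems with many variables.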


References

  1. Abramson, M.A., Audet, C.: Convergence of mesh adaptive direct search to second-order stationary points. SIAM J. Optim. 17, 606–609 (2006)

  2. Audet, C., Dennis, J.E., Jr.: Mesh adaptive direct search algorithms for constrained optimization. SIAM J. Optim. 17, 188–217 (2006)

  3. Audet, C., Dennis, J.E., Jr., Le Digabel, S.: Parallel space decomposition of the mesh adaptive direct search algorithm. SIAM J. Optim. 19, 1150–1170 (2008)

  4. Bertsekas, D.P., Tsitsiklis, J.N.: Parallel and Distributed Computation: Numerical Methods, vol. 23. Prentice Hall, Englewood Cliffs, NJ (1989)

  5. Butnariu, D., Davidi, R., Herman, G.T., Kazantsev, I.G.: Stable convergence behavior under summable perturbations of a class of projection methods for convex feasibility and optimization problems. IEEE J. Sel. Top. Signal Process. 1(4), 540–547 (2007)

  6. Censor, Y.: Weak and strong superiorization: between feasibility-seeking and minimization. Analele Universitatii "Ovidius" Constanta-Seria Matematica 23(3), 41–54 (2015)

  7. Censor, Y., Garduño, E., Helou, E.S., Herman, G.T.: Derivative-free superiorization: principle and algorithm. arXiv preprint arXiv:1908.10100 (2019)

  8. Censor, Y., Heaton, H., Schulte, R.: Derivative-free superiorization with component-wise perturbations. Numer. Algorithms 80(4), 1219–1240 (2019)

  9. Censor, Y., Zaslavski, A.J.: Convergence and perturbation resilience of dynamic string-averaging projection methods. Comput. Optim. Appl. 54(1), 65–76 (2013)

  10. Censor, Y., Zaslavski, A.J.: Strict Fejér monotonicity by superiorization of feasibility-seeking projection methods. J. Optim. Theory Appl. 165(1), 172–187 (2015)

  11. Conn, A.R., Gould, N., Lescrenier, M., Toint, P.L.: Performance of a multifrontal scheme for partially separable optimization. In: Gomez, S., Hennart, J.-P. (eds.) Advances in Optimization and Numerical Analysis, pp. 79–96. Kluwer Academic Publishers, Dordrecht (1994)

  12. Conn, A.R., Scheinberg, K., Toint, P.L.: On the convergence of derivative-free methods for unconstrained optimization. In: Buhmann, M.D., Iserles, A. (eds.) Approximation Theory and Optimization, Tribute to M. J. D. Powell, pp. 83–108. Cambridge University Press, Cambridge (1996)

  13. Conn, A.R., Scheinberg, K., Vicente, L.N.: Global convergence of general derivative-free trust-region algorithms to first and second order critical points. SIAM J. Optim. 20, 387–415 (2009)

  14. Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to Derivative-free Optimization. SIAM, Philadelphia (2009)

  15. Custódio, A., Scheinberg, K., Vicente, L.N.: Methodologies and software for derivative-free optimization. In: Advances and Trends in Optimization with Engineering Applications, pp. 495–506. SIAM, Philadelphia (2017)

  16. Davidi, R., Herman, G.T., Censor, Y.: Perturbation-resilient block-iterative projection methods with application to image reconstruction from projections. Int. Trans. Oper. Res. 16(4), 505–524 (2009)

  17. Dennis, J.E., Jr., Torczon, V.J.: Direct search methods on parallel machines. SIAM J. Optim. 1, 448–474 (1991)

  18. Ferris, M.C., Mangasarian, O.L.: Parallel variable distribution. SIAM J. Optim. 4(4), 815–832 (1994)

  19. Frommer, A., Renaut, R.A.: A unified approach to parallel space decomposition methods. J. Comput. Appl. Math. 110(1), 205–223 (1999)

  20. Fukushima, M.: Parallel variable transformation in unconstrained optimization. SIAM J. Optim. 8(3), 658–672 (1998)

  21. García-Palomares, U.M., García-Urrea, I.J., Rodríguez-Hernández, P.S.: On sequential and parallel non-monotone derivative-free algorithms for box constrained optimization. Optim. Methods Softw. 28(6), 1233–1261 (2013)

  22. García-Palomares, U.M., Rodríguez, J.F.: New sequential and parallel derivative-free algorithms for unconstrained minimization. SIAM J. Optim. 13(1), 79–96 (2002)

  23. Garduño, E., Herman, G.T.: Superiorization of the ML-EM algorithm. IEEE Trans. Nucl. Sci. 61(1), 162–172 (2013)

  24. Gilmore, P., Kelley, C.T.: An implicit filtering algorithm for optimization of functions with many local minima. SIAM J. Optim. 5, 269–285 (1995)

  25. GLOBAL Library. http://www.gamsworld.org/global/globallib.htm

  26. Gray, G., Kolda, T., Sale, K., Young, M.: Optimizing an empirical scoring function for transmembrane protein structure determination. INFORMS J. Comput. 16, 406–418 (2004)

  27. Gray, G.A., Kolda, T.G.: Algorithm 856: APPSPACK 4.0: parallel pattern search for derivative-free optimization. ACM Trans. Math. Softw. 32, 485–507 (2006)

  28. Hare, W., Nutini, J., Tesfamariam, S.: A survey of non-gradient optimization methods in structural engineering. Adv. Eng. Softw. 59, 19–28 (2013)

  29. Hayes, R.E., Bertrand, F.H., Audet, C., Kolaczkowski, S.T.: Catalytic combustion kinetics: using a direct search algorithm to evaluate kinetic parameters from light-off curves. Can. J. Chem. Eng. 81, 1192–1199 (2003)

  30. Herman, G.T., Garduño, E., Davidi, R., Censor, Y.: Superiorization: an optimization heuristic for medical physics. Med. Phys. 39(9), 5532–5546 (2012)

  31. Holland, J.H.: Adaptation in Natural and Artificial Systems. The University of Michigan Press, New York (1975)

  32. Hosseinabadi, A.A.R., Vahidi, J., Saemi, B., Sangaiah, A.K., Elhoseny, M.: Extended genetic algorithm for solving open-shop scheduling problem. Soft Comput. 23(13), 5099–5116 (2019)

  33. Hough, P.D., Kolda, T.G., Torczon, V.J.: Asynchronous parallel pattern search for nonlinear optimization. SIAM J. Sci. Comput. 23, 134–156 (2001)

  34. Hu, X., He, F., Chen, W., Zhang, J.: Cooperation coevolution with fast interdependency identification for large scale optimization. Inf. Sci. 381, 142–160 (2017)

  35. Huyer, W., Neumaier, A.: Global optimization by multilevel coordinate search. J. Global Optim. 14, 331–355 (1999)

  36. Huyer, W., Neumaier, A.: SNOBFIT-Stable noisy optimization by branch and fit. ACM Trans. Math. Softw. 35, 1–25 (2008)

  37. Jones, D.R., Perttunen, C.D., Stuckman, B.E.: Lipschitzian optimization without the Lipschitz constant. J. Optim. Theory Appl. 79, 157–181 (1993)

  38. Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948. Piscataway, NJ, USA (1995)

  39. Kirkpatrick, S., Gelatt, C.D., Vecchi, M.P.: Optimization by simulated annealing. Science 220, 671–680 (1983)

  40. Kozachenko, L.F., Leonenko, N.N.: Sample estimate of the entropy of a random vector. Problems Inf. Transm. 23(2), 95–101 (1987)

  41. Liu, C., Tseng, C.: Space-decomposition minimization method for large-scale minimization problems. Comput. Math. Appl. 37(7), 73–88 (1999)

  42. Liu, C., Tseng, C.: Parallel synchronous and asynchronous space-decomposition algorithms for large-scale minimization problems. Comput. Optim. Appl. 17(1), 85–107 (2000)

  43. Lukšan, L., Vlček, J.: Test problems for nonsmooth unconstrained and linearly constrained optimization. Technical report, Institute of Computer Science, Academy of Sciences of the Czech Republic (2000). http://www3.cs.cas.cz/ics/reports/v798-00.ps

  44. Mangasarian, O.L.: Parallel gradient distribution in unconstrained optimization. SIAM J. Control Optim. 33(6), 1916–1925 (1995)

  45. McKay, M.D., Beckman, R.J., Conover, W.J.: A comparison of three methods for selecting values of input variables in the analysis of output from a computer code. Technometrics 21, 239–245 (1979)

  46. Mei, Y., Omidvar, M.N., Li, X., Yao, X.: A competitive divide-and-conquer algorithm for unconstrained large-scale black-box optimization. ACM Trans. Math. Softw. 42(2), 13 (2016)

  47. Nelder, J.A., Mead, R.: A simplex method for function minimization. Comput. J. 7, 308–313 (1965)

  48. Nesterov, Y.: Gradient methods for minimizing composite objective function. Math. Program. Ser. B 140, 125–161 (2013)

  49. Omidvar, M.N., Yang, M., Mei, Y., Li, X., Yao, X.: DG2: a faster and more accurate differential grouping for large-scale black-box optimization. IEEE Trans. Evol. Comput. 21(6), 929–942 (2017)

  50. Powell, M.J.D.: The NEWUOA software for unconstrained optimization without derivatives. In: Di Pillo, G., Roma, M. (eds.) Large-Scale Nonlinear Optimization, pp. 255–297. Springer, New York (2006)

  51. Princeton Library. http://www.gamsworld.org/performance/princetonlib/princetonlib.htm

  52. Richtarik, P.: Improved algorithms for convex minimization in relative scale. SIAM J. Optim. 21, 1141–1167 (2011)

  53. Rios, L.M., Sahinidis, N.V.: Derivative-free optimization: a review of algorithms and comparison of software implementations. J. Global Optim. 56, 1247–1293 (2013)

  54. Sun, Y., Kirley, M., Halgamuge, S.K.: Extended differential grouping for large scale global optimization with direct and indirect variable interactions. In: Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation, pp. 313–320. ACM (2015)

  55. Torczon, V.J.: On the convergence of multidirectional search algorithms. SIAM J. Optim. 1, 123–145 (1991)

  56. Torczon, V.J.: On the convergence of pattern search algorithms. SIAM J. Optim. 7, 1–25 (1997)

  57. Yamakawa, E., Fukushima, M.: Testing parallel variable transformation. Comput. Optim. Appl. 13(1–3), 253–274 (1999)


Acknowledgements

This work was funded by Dow’s University Partnership Initiative.

Author information

Correspondence to Nikolaos V. Sahinidis.



Cite this article

Ma, K., Sahinidis, N.V., Rajagopalan, S. et al. Decomposition in derivative-free optimization. J Glob Optim 81, 269–292 (2021). https://doi.org/10.1007/s10898-021-01051-w
