Abstract
Univariate box-constrained global optimization problems are considered, where the objective function is assumed to be multiextremal and Lipschitz continuous. Its analytical representation is unknown (the function is given as a "black box"), and even a single evaluation of it is a computationally expensive procedure. Geometric and information-statistical frameworks for constructing global optimization algorithms are discussed. Several powerful acceleration techniques are described, and a number of methods of both classes are constructed by combining the introduced acceleration ideas. Numerical experiments on broad test classes taken from the literature show the advantages of the presented techniques with respect to their direct competitors.
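As an illustration of the geometric framework mentioned above, the following minimal sketch implements a Piyavskii-type method for a black-box univariate function f on [a, b]. It is a didactic approximation only, not the accelerated methods studied in the paper: the function name piyavskii_minimize, the fixed Lipschitz overestimate L, the stopping tolerance, and the test function are illustrative assumptions, whereas the techniques discussed here estimate the Lipschitz information adaptively and locally.

import math

# A minimal sketch (not the authors' implementation) of a geometric,
# Piyavskii-type method for minimizing a black-box univariate function
# f on [a, b], assuming an overestimate L of its Lipschitz constant.
def piyavskii_minimize(f, a, b, L, max_evals=50):
    # Trial points and their function values, kept sorted by x.
    pts = sorted([(a, f(a)), (b, f(b))])
    for _ in range(max_evals - 2):
        # For each interval, the piecewise-linear lower bounding function
        # built from the two Lipschitz cones attains its minimum at the
        # "characteristic" point of that interval.
        best = None
        for (x_l, z_l), (x_r, z_r) in zip(pts, pts[1:]):
            x_new = 0.5 * (x_l + x_r) - (z_r - z_l) / (2.0 * L)
            lower = 0.5 * (z_l + z_r) - 0.5 * L * (x_r - x_l)
            if best is None or lower < best[0]:
                best = (lower, x_new)
        lower, x_new = best
        record = min(z for _, z in pts)
        # Stop when the global lower bound is close to the current record.
        if record - lower < 1e-6:
            break
        pts.append((x_new, f(x_new)))
        pts.sort()
    return min(pts, key=lambda p: p[1])

# Example usage on a simple multiextremal test function.
if __name__ == "__main__":
    f = lambda x: math.sin(x) + math.sin(10.0 * x / 3.0)
    print(piyavskii_minimize(f, 2.7, 7.5, L=6.0, max_evals=100))

On each iteration the sketch selects the interval whose lower bounding minorant has the smallest value and places the next trial point at that minimizer; the acceleration techniques described in the paper act precisely on this interval-selection and stopping logic.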
Acknowledgements
The work of M.S. Mukhametzhanov was supported by the INdAM-GNCS funding “Giovani Ricercatori 2018–2019”.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Kvasov, D.E., Mukhametzhanov, M.S., Nasso, M.C., Sergeyev, Y.D. (2020). On Acceleration of Derivative-Free Univariate Lipschitz Global Optimization Methods. In: Sergeyev, Y., Kvasov, D. (eds.) Numerical Computations: Theory and Algorithms. NUMTA 2019. Lecture Notes in Computer Science, vol. 11974. Springer, Cham. https://doi.org/10.1007/978-3-030-40616-5_38
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-40615-8
Online ISBN: 978-3-030-40616-5
eBook Packages: Computer Science (R0)