DOI: 10.1145/3583133.3590743

Poster

Minimizing Black Boxes due to Polynomial-Model-Based Optimization

Published: 24 July 2023

Abstract

We introduce Polynomial-Model-Based Optimization (PMBO), a novel black-box optimization method. The algorithm follows the main idea of Bayesian Optimization (BO) [16]; however, instead of inferring a stochastic process, it fits a polynomial surrogate of the objective function and exploits the surrogate in the regions where the objective is expected to attain its minimum. We compare PMBO with state-of-the-art optimization algorithms on common test functions and a real-world optimization problem. The results demonstrate that PMBO is capable of inferring the inherent structure of the objective function, which accounts for its efficiency, and suggest that PMBO may be a pivotal choice for optimizing highly structured real-world problems.
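The paper itself builds on multivariate polynomial interpolation (minterpy [1]); as a rough illustration of the surrogate-driven loop the abstract describes, the following 1-D sketch fits a least-squares polynomial to all evaluations seen so far and samples next where the surrogate predicts the lowest value. The function name `pmbo_minimize`, the random candidate search, and the fixed polynomial degree are illustrative assumptions, not the authors' implementation, which also balances exploration against exploitation.

```python
import numpy as np

def pmbo_minimize(f, bounds, degree=4, n_init=6, n_iter=20, n_cand=2000, rng=None):
    """Toy polynomial-model-based optimization loop in one dimension.

    Repeatedly fits a polynomial surrogate to all (x, f(x)) pairs seen
    so far, then evaluates f at the candidate point where the surrogate
    is lowest (pure exploitation; no exploration term).
    """
    rng = np.random.default_rng(rng)
    lo, hi = bounds
    X = rng.uniform(lo, hi, n_init)              # initial random design
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        # Least-squares polynomial surrogate of the observations so far.
        coeffs = np.polyfit(X, y, deg=min(degree, len(X) - 1))
        cand = rng.uniform(lo, hi, n_cand)       # random candidate pool
        x_next = cand[np.argmin(np.polyval(coeffs, cand))]
        X = np.append(X, x_next)                 # evaluate the true objective there
        y = np.append(y, f(x_next))
    best = np.argmin(y)
    return X[best], y[best]

# Toy usage: minimize a shifted quadratic on [-5, 5].
x_star, f_star = pmbo_minimize(lambda x: (x - 2.0) ** 2 + 1.0, (-5.0, 5.0), rng=0)
```

Because the surrogate here is fitted by plain least squares, this sketch inherits none of the stability guarantees of the interpolation scheme used in the paper; it only conveys the fit-then-minimize structure shared with BO.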

References

[1]
U. H. Acosta, S. K. T. Veettil, D. Wicaksono, and M. Hecht. 2021. minterpy - Multivariate Interpolation in Python. https://github.com/casus/minterpy (2021).
[2]
J. Bergstra and Y. Bengio. 2012. Random search for hyper-parameter optimization. The Journal of Machine Learning Research 13, 1 (2012), 281--305.
[3]
B. Efron. 1979. Bootstrap methods: another look at the jackknife. Annals of Statistics 7, 1 (1979), 1--26.
[4]
F.-A. Fortin, F.-M. De Rainville, M. Gardner, M. Parizeau, and C. Gagné. 2012. DEAP: Evolutionary Algorithms Made Easy. Journal of Machine Learning Research 13 (2012), 2171--2175.
[5]
J. D. Gergonne. 1974. The application of the method of least squares to the interpolation of sequences. Historia Mathematica 1, 4 (1974), 439--447.
[6]
N. Hansen, A. Auger, R. Ros, O. Mersmann, T. Tušar, and D. Brockhoff. 2021. COCO: A Platform for Comparing Continuous Optimizers in a Black-Box Setting. Optimization Methods and Software 36, 1 (2021), 114--144.
[7]
N. Hansen, S. D. Müller, and P. Koumoutsakos. 2003. Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evolutionary Computation 11, 1 (2003), 1--18.
[8]
M. Hecht, B. L. Cheeseman, K. B. Hoffmann, and I. F. Sbalzarini. 2017. A Quadratic-Time Algorithm for General Multivariate Polynomial Interpolation. arXiv preprint arXiv:1710.10846 (2017).
[9]
M. Hecht, K. Gonciarz, J. Michelfeit, V. Sivkin, and I. F. Sbalzarini. 2020. Multivariate Interpolation in Unisolvent Nodes - Lifting the Curse of Dimensionality. arXiv preprint arXiv:2010.10824 (2020).
[10]
M. Hecht, K. Gonciarz, J. Michelfeit, V. Sivkin, and I. F. Sbalzarini. 2020. Multivariate Interpolation on Unisolvent Nodes - Lifting the Curse of Dimensionality. arXiv preprint arXiv:2010.10824 (2020).
[11]
M. Hecht, K. B. Hoffmann, B. L. Cheeseman, and I. F. Sbalzarini. 2018. Multivariate Newton Interpolation. arXiv preprint arXiv:1812.04256 (2018).
[12]
M. Hecht and I. F. Sbalzarini. 2018. Fast Interpolation and Fourier Transform in High-Dimensional Spaces. In Intelligent Computing. Proc. 2018 IEEE Computing Conf., Vol. 2, (Advances in Intelligent Systems and Computing, Vol. 857), K. Arai, S. Kapoor, and R. Bhatia (Eds.). Springer Nature, London, UK, 53--75.
[13]
J. H. Holland. 1992. Genetic Algorithms. Scientific American 267, 1 (1992), 66--72.
[14]
M. Jamil and X. Yang. 2013. A literature survey of benchmark functions for global optimisation problems. International Journal of Mathematical Modelling and Numerical Optimisation 4, 2 (2013), 150--194.
[15]
S. Jlassi, I. Jdey, and H. Ltiff. 2021. Bayesian hyperparameter optimization of deep neural network algorithms based on ant colony optimization. In Document Analysis and Recognition - ICDAR 2021. Springer International Publishing, Cham, 585--594.
[16]
D. Jones, M. Schonlau, and W. J. Welch. 1998. Efficient global optimization of expensive black-box functions. Journal of Global Optimization 13, 4 (1998), 455--492.
[17]
J. Kennedy. 2017. Particle Swarm Optimization. Springer US, Boston, MA, 967--972.
[18]
G. Letort, A. Montagud, G. Stoll, R. Heiland, E. Barillot, P. Macklin, A. Zinovyev, and L. Calzone. 2019. PhysiBoSS: a multi-scale agent-based modelling framework integrating physical dimension and cell signalling. Bioinformatics 35, 7 (2019), 1188--1196.
[19]
E. Meijering. 2002. A chronology of interpolation: from ancient astronomy to modern signal and image processing. Proc. IEEE 90, 3 (2002), 319--342.
[20]
J. Ozik. 2019. Learning-accelerated discovery of immune-tumour interactions. Molecular Systems Design & Engineering (2019).
[21]
J. Ozik, N. Collier, J. M. Wozniak, C. Macal, C. Cockrell, S. H. Friedman, A. Ghaffarizadeh, R. Heiland, G. An, and P. Macklin. 2018. High-throughput cancer hypothesis testing with an integrated PhysiCell-EMEWS workflow. BMC Bioinformatics 19, Suppl 18 (2018), 483.
[22]
M. Ponce-de Leon, A. Montagud, C. Akasiadis, J. Schreiber, T. Ntiniakou, and A. Valencia. 2022. Optimizing Dosage-Specific Treatments in a Multi-Scale Model of a Tumor Growth. Frontiers in Molecular Biosciences 9, 836794 (2022).
[23]
J. Snoek, H. Larochelle, and R. P. Adams. 2012. Practical Bayesian optimization of machine learning algorithms. In Advances in Neural Information Processing Systems (2012), 2951--2959.
[24]
S. X. Souza, J. A. K. Suykens, J. Vandewalle, and D. Bolle. 2009. Coupled Simulated Annealing. (2009).
[25]
S. M. Stigler. 1974. Gergonne's 1815 paper on the design and analysis of polynomial regression experiments. Historia Mathematica 1, 4 (1974), 431--439.
[26]
S. K. T. Veettil, Y. Zheng, U. H. Acosta, D. Wicaksono, and M. Hecht. 2022. Multivariate Polynomial Regression of Euclidean Degree Extends the Stability for Fast Approximations of Trefethen Functions. arXiv preprint arXiv:2212.11706 (2022).
[27]
S. Wanner. 2021. Hyppopy: A Hyper-Parameter Optimization Toolbox. (2021). https://github.com/MIC-DKFZ/Hyppopy
[28]
D. H. Wolpert and W. G. Macready. 1997. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation 1, 1 (1997), 67--82.

Published In

GECCO '23 Companion: Proceedings of the Companion Conference on Genetic and Evolutionary Computation
July 2023
2519 pages
ISBN:9798400701207
DOI:10.1145/3583133
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. polynomial interpolation
  2. black box optimization
  3. hyper-parameter tuning

Qualifiers

  • Poster

Conference

GECCO '23 Companion

Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%
