Abstract
Many advanced solving algorithms for constraint programming problems are highly configurable. The research area of algorithm configuration investigates ways of automatically configuring these solvers in the best manner possible. In this paper, we specifically focus on algorithm configuration in which the objective is to decrease the time it takes the solver to find an optimal solution. In this setting, adaptive capping is a popular technique which reduces the overall runtime of the search for good configurations by adaptively setting the solver’s timeout to the best runtime found so far. Additionally, sequential model-based optimization (SMBO)—in which one iteratively learns a surrogate model that can predict the runtime of unseen configurations—has proven to be a successful paradigm. Unfortunately, adaptive capping and SMBO have thus far remained incompatible, as in adaptive capping, one cannot observe the true runtime of runs that time out, precluding the typical use of SMBO. To marry adaptive capping and SMBO, we instead use SMBO to model the probability that a configuration will improve on the best runtime achieved so far, for which we propose several decomposed models. These models also allow defining prior probabilities for each hyperparameter. The experimental results show that our DeCaprio method speeds up hyperparameter search compared to random search and the seminal adaptive capping approach of ParamILS.
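To make the core idea concrete, the following is a minimal sketch, in plain Python, of a configuration loop that combines adaptive capping with a decomposed model of improvement probability. It is not the authors' DeCaprio implementation: the configuration space, the `run_solver` placeholder, and the `DecomposedSurrogate` class are illustrative assumptions, and the actual decomposed models and priors are those defined in the paper.

```python
import random
import time
from collections import defaultdict

# Hypothetical configuration space: each hyperparameter with candidate values.
# (Real solver hyperparameters are solver-specific; these names are illustrative.)
CONFIG_SPACE = {
    "restart_strategy": ["luby", "geometric", "none"],
    "branching": ["min_domain", "max_degree", "random"],
    "probing_level": [0, 1, 2],
}

class DecomposedSurrogate:
    """Toy decomposed model: per hyperparameter value, estimate the fraction of
    evaluated configurations containing that value which improved on the
    incumbent, and score a configuration by the product of those estimates."""

    def __init__(self, prior=0.5, strength=2.0):
        # Beta-style smoothing acts as a prior probability per hyperparameter value.
        self.prior, self.strength = prior, strength
        self.improved = defaultdict(float)
        self.tried = defaultdict(float)

    def update(self, config, improved):
        for param, value in config.items():
            self.tried[(param, value)] += 1.0
            self.improved[(param, value)] += float(improved)

    def score(self, config):
        p = 1.0
        for param, value in config.items():
            key = (param, value)
            p *= (self.improved[key] + self.prior * self.strength) / (
                self.tried[key] + self.strength)
        return p

def run_solver(instance, config, timeout):
    """Placeholder: run the solver on `instance` with `config`, capped at
    `timeout` seconds; return the runtime to an optimal solution, or None if
    the run was capped."""
    raise NotImplementedError

def configure(instance, budget, initial_timeout=60.0, n_candidates=100):
    """Model-based search with adaptive capping: every run is capped at the
    best runtime seen so far, and the surrogate chooses what to try next."""
    model = DecomposedSurrogate()
    best_config, best_runtime = None, initial_timeout
    deadline = time.monotonic() + budget
    while time.monotonic() < deadline:
        # Sample candidates; try the one the model deems most likely to improve.
        candidates = [
            {p: random.choice(vs) for p, vs in CONFIG_SPACE.items()}
            for _ in range(n_candidates)
        ]
        config = max(candidates, key=model.score)
        runtime = run_solver(instance, config, timeout=best_runtime)
        improved = runtime is not None and runtime < best_runtime
        model.update(config, improved)
        if improved:
            best_config, best_runtime = config, runtime  # cap tightens from here on
    return best_config, best_runtime
```

Note the design point this sketch illustrates: a run capped at the incumbent's runtime reveals only that the configuration failed to improve, which is exactly the binary label the surrogate consumes. A surrogate that regresses on runtime would instead need the true runtime of capped runs, which is unobservable; this is why adaptive capping and standard SMBO conflict, and why modeling the probability of improvement sidesteps the problem.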
Ignace Bleukx and Senne Berden contributed equally to this work.
Notes
1. DEcomposable adaptive CApping with PRIOrs.
2. From Håkan Kjellerstrand's collection: http://www.hakank.org/cpmpy/.
3. The number of threads was limited to 1 for every solver call.
References
Anastacio, M., Hoos, H.H.: Model-based algorithm configuration with default-guided probabilistic sampling. In: Bäck, T. (ed.) PPSN 2020. LNCS, vol. 12269, pp. 95–110. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58112-1_7
Bergstra, J., Bardenet, R., Bengio, Y., Kégl, B.: Algorithms for hyper-parameter optimization. In: NIPS, pp. 2546–2554 (2011)
Cáceres, L.P., López-Ibáñez, M., Hoos, H.H., Stützle, T.: An experimental study of adaptive capping in irace. In: Battiti, R., Kvasov, D.E., Sergeyev, Y.D. (eds.) LION 2017. LNCS, vol. 10556, pp. 235–250. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-69404-7_17
De Souza, M., Ritt, M., López-Ibáñez, M.: Capping methods for the automatic configuration of optimization algorithms. Comput. Oper. Res. 139, 105615 (2021)
Fichte, J.K., Hecher, M., McCreesh, C., Shahab, A.: Complications for computational experiments from modern processors. In: 27th International Conference on Principles and Practice of Constraint Programming (CP 2021). Schloss Dagstuhl-Leibniz-Zentrum für Informatik (2021)
Guns, T.: Increasing modeling language convenience with a universal n-dimensional array, CPpy as python-embedded example. In: Proceedings of the 18th Workshop on Constraint Modelling and Reformulation, Held with CP, vol. 19 (2019)
Hutter, F., Hoos, H.H., Leyton-Brown, K., Stützle, T.: ParamILS: an automatic algorithm configuration framework. J. Artif. Intell. Res. (JAIR) 36, 267–306 (2009)
Hutter, F., Hoos, H.H., Leyton-Brown, K.: Automated configuration of mixed integer programming solvers. In: Lodi, A., Milano, M., Toth, P. (eds.) CPAIOR 2010. LNCS, vol. 6140, pp. 186–202. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-13520-0_23
Hutter, F., Hoos, H.H., Leyton-Brown, K.: Sequential model-based optimization for general algorithm configuration. In: Coello, C.A.C. (ed.) LION 2011. LNCS, vol. 6683, pp. 507–523. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-25566-3_40
Kerschke, P., Hoos, H.H., Neumann, F., Trautmann, H.: Automated algorithm selection: survey and perspectives. Evol. Comput. 27(1), 3–45 (2019). https://doi.org/10.1162/evco_a_00242
López-Ibáñez, M., Dubois-Lacoste, J., Cáceres, L.P., Birattari, M., Stützle, T.: The irace package: iterated racing for automatic algorithm configuration. Oper. Res. Perspect. 3, 43–58 (2016)
Perron, L., Furnon, V.: OR-Tools. https://developers.google.com/optimization/
Yogatama, D., Mann, G.: Efficient transfer learning method for automatic hyperparameter tuning. In: Artificial Intelligence and Statistics, pp. 1077–1085. PMLR (2014)
Acknowledgments
This research was partly funded by the Flemish Government (AI Research Program), the Research Foundation - Flanders (FWO) projects G0G3220N and S007318N and the European Research Council (ERC) under the EU Horizon 2020 research and innovation programme (Grant No 101002802, CHAT-Opt).
A Adapted SMBO

Copyright information
© 2022 Springer Nature Switzerland AG
Cite this paper
Bleukx, I., Berden, S., Coenen, L., Decleyre, N., Guns, T. (2022). Model-Based Algorithm Configuration with Adaptive Capping and Prior Distributions. In: Schaus, P. (eds) Integration of Constraint Programming, Artificial Intelligence, and Operations Research. CPAIOR 2022. Lecture Notes in Computer Science, vol 13292. Springer, Cham. https://doi.org/10.1007/978-3-031-08011-1_6
DOI: https://doi.org/10.1007/978-3-031-08011-1_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-08010-4
Online ISBN: 978-3-031-08011-1