Abstract
Bayesian optimization (BO) has achieved remarkable success in optimizing low-dimensional continuous problems. Recently, BO over high-dimensional discrete solution spaces has been in increasing demand, yet satisfactory BO algorithms tailored to this setting are still lacking. Fortunately, it has been observed that high-dimensional discrete optimization problems may possess a low-dimensional intrinsic subspace. Inspired by this observation, this paper proposes a Locality Sensitive Hashing based Bayesian Optimization (LSH-BO) method for high-dimensional discrete functions with intrinsic dimension. By randomly embedding solutions from the intrinsic subspace into the original space and then discretizing them, LSH-BO turns high-dimensional discrete optimization problems into low-dimensional continuous ones. Theoretically, we prove that, with probability 1, there exists a corresponding optimal solution in the intrinsic subspace. Empirical results on both synthetic functions and a binary quadratic programming task verify that LSH-BO surpasses the compared methods and remains versatile across low-dimensional and high-dimensional kernels.
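The embedding-plus-discretization step described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact construction: it assumes a Gaussian random embedding matrix and sign-based (hyperplane) rounding to binary solutions, in the spirit of locality sensitive hashing; the names `embed_and_discretize`, `d`, and `D` are illustrative.

```python
import numpy as np

def embed_and_discretize(y, A):
    """Map a low-dimensional continuous point y (shape (d,)) to a
    high-dimensional binary solution x in {-1, +1}^D by a random
    linear embedding followed by sign rounding (LSH-style)."""
    z = A @ y                       # embed into the D-dimensional original space
    return np.where(z >= 0, 1.0, -1.0)  # discretize each coordinate by its sign

rng = np.random.default_rng(0)
d, D = 4, 100                       # intrinsic dimension d << original dimension D
A = rng.standard_normal((D, d))     # random embedding matrix, fixed once per run
y = rng.uniform(-1.0, 1.0, size=d)  # a candidate proposed by BO in the subspace
x = embed_and_discretize(y, A)      # binary solution evaluated by the objective
```

With such a mapping, a standard continuous BO loop can search over `y` in the d-dimensional subspace while the black-box objective is only ever queried at binary points `x`.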
This work is supported by National Natural Science Foundation of China (No. 62106076), Natural Science Foundation of Shanghai (No. 21ZR1420300), “Chenguang Program” sponsored by Shanghai Education Development Foundation and Shanghai Municipal Education Commission (No. 21CGA32), and National Key Laboratory for Novel Software Technology at Nanjing University (No. KFKT2021B14). Hong Qian is the corresponding author of this paper.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Li, SJ., Li, M., Qian, H. (2022). High-Dimensional Discrete Bayesian Optimization with Intrinsic Dimension. In: Khanna, S., Cao, J., Bai, Q., Xu, G. (eds) PRICAI 2022: Trends in Artificial Intelligence. PRICAI 2022. Lecture Notes in Computer Science, vol 13629. Springer, Cham. https://doi.org/10.1007/978-3-031-20862-1_39
Print ISBN: 978-3-031-20861-4
Online ISBN: 978-3-031-20862-1