Abstract
Constraints can render a numerical optimization problem considerably more difficult to solve. In many real-world optimization applications, however, such constraints are not given explicitly. Instead, one only has access to some kind of "black-box" that represents the (unknown) constraint function. Recently, we proposed a fast linear constraint estimator based on binary search. This paper extends these results by (a) providing an alternative scheme that makes effective use of support vector machines and (b) addressing the more general task of non-linear decision boundaries. In particular, we employ active learning strategies from the field of machine learning to select informative training points for the repeated application of the classifier. We compare both constraint estimation schemes on linear and non-linear constraint functions, and highlight opportunities and pitfalls concerning the effective integration of such models into a global optimization process.
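As a rough illustration of the SVM-based scheme described above, the following Python sketch couples a support vector classifier with margin-based active learning: in each round, the unlabeled candidate closest to the current decision boundary is evaluated by the black-box constraint and added to the training set. The oracle `feasible`, the sampling box, and all parameter settings are illustrative assumptions, not the setup used in the paper.

```python
# Minimal sketch (not the authors' implementation): estimating an unknown
# constraint with an SVM and margin-based active learning. The oracle
# `feasible`, the search box, and all settings below are assumptions.
import numpy as np
from sklearn.svm import SVC

def feasible(x):
    """Black-box constraint oracle: +1 = feasible, -1 = infeasible.
    Unknown to the optimizer; each call is assumed to be expensive."""
    return 1 if x[0] + 0.5 * x[1] ** 2 <= 1.0 else -1

rng = np.random.default_rng(0)

# Small initial design, labeled via the expensive oracle
# (assumes both classes occur among the initial points).
X = rng.uniform(-2.0, 2.0, size=(10, 2))
y = np.array([feasible(x) for x in X])

clf = SVC(kernel="rbf", C=10.0)  # RBF kernel handles non-linear boundaries
for _ in range(30):
    clf.fit(X, y)
    # Margin-based query: label the candidate closest to the current
    # decision boundary, i.e. the point the classifier is least sure about.
    candidates = rng.uniform(-2.0, 2.0, size=(200, 2))
    x_new = candidates[np.argmin(np.abs(clf.decision_function(candidates)))]
    X = np.vstack([X, x_new])
    y = np.append(y, feasible(x_new))
```

The resulting classifier can then act as a surrogate for the unknown constraint, for instance to pre-screen candidate solutions before the expensive oracle is consulted.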
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Gieseke, F., Kramer, O. (2013). Towards Non-linear Constraint Estimation for Expensive Optimization. In: Esparcia-Alcázar, A.I. (ed.) Applications of Evolutionary Computation. EvoApplications 2013. Lecture Notes in Computer Science, vol. 7835. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-37192-9_46
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-37191-2
Online ISBN: 978-3-642-37192-9