Learning from examples is a key capability of autonomous agents. In this contribution, we focus on a particular class of strategies often referred to as "optimal experimental design" or "active learning". Learning machines that employ these strategies request the examples that are maximally informative for learning a predictor, rather than "passively" scanning their environment. A large body of empirical evidence shows that active learning is more efficient in terms of the number of examples required, so it should be preferred whenever training examples are costly to obtain. We report new results for active learning methods we are currently investigating, which are based on the geometrical concept of a version space. We derive universal hard bounds on prediction performance using tools from differential geometry, and we provide practical algorithms based on kernel methods and Monte-Carlo techniques. The new techniques are applied to psychoacoustical experiments for sound design.
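To make the version-space idea concrete: for a linear classifier, the version space is the set of weight vectors on the unit sphere that are consistent with all labels observed so far, and an active learner queries the unlabelled point whose label would cut this set roughly in half. The sketch below is only an illustration under simplifying assumptions (a linearly separable problem, labels in {-1, +1}, naive rejection sampling); the function names and the sampling scheme are our own and are not the kernelised algorithms developed in the chapter.

import numpy as np

rng = np.random.default_rng(0)

def sample_version_space(X, y, n_samples=200, max_tries=100000):
    """Monte-Carlo approximation of the version space: rejection-sample
    unit weight vectors w that classify every labelled example
    (x_i, y_i), y_i in {-1, +1}, correctly."""
    d = X.shape[1]
    kept = []
    for _ in range(max_tries):
        w = rng.normal(size=d)
        w /= np.linalg.norm(w)          # uniform direction on the unit sphere
        if np.all(y * (X @ w) > 0):     # consistent with all labels seen so far
            kept.append(w)
            if len(kept) == n_samples:
                break
    return np.array(kept)

def query_index(X_pool, W):
    """Return the index of the pool point on which the sampled committee
    disagrees most; labelling it splits the approximated version space
    roughly in half."""
    p_plus = ((X_pool @ W.T) > 0).mean(axis=1)  # fraction of samples voting +1
    return int(np.argmin(np.abs(p_plus - 0.5)))

# Example use: after labelling a few stimuli, sample the version space and
# pick the next example to present:
#   W = sample_version_space(X_labelled, y_labelled)
#   i = query_index(X_pool, W)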
Copyright information
© 2008 Springer Science + Business Media B.V.
Cite this paper
Adiloglu, K., Annies, R., Henrich, FF., Paus, A., Obermayer, K. (2008). Geometrical Approaches to Active Learning. In: Mahr, B., Huanye, S. (eds) Autonomous Systems – Self-Organization, Management, and Control. Springer, Dordrecht. https://doi.org/10.1007/978-1-4020-8889-6_2
DOI: https://doi.org/10.1007/978-1-4020-8889-6_2
Publisher Name: Springer, Dordrecht
Print ISBN: 978-1-4020-8888-9
Online ISBN: 978-1-4020-8889-6