Abstract
This paper discusses theoretical limitations of classification systems that are based on feature maps and use a separating hyperplane in the feature space. In particular, we study the embeddability of a given concept class into a class of Euclidean half spaces of low dimension, or of arbitrarily large dimension but realizing a large margin. New bounds on the smallest possible dimension and on the largest possible margin are presented. In addition, we present new results on the rigidity of matrices and briefly mention applications in complexity and learning theory.
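The objects studied in the abstract can be made concrete with a small sketch: a finite concept class is recorded as a sign matrix, and an embedding into Euclidean half spaces assigns unit vectors to instances and concepts so that the signs of their inner products reproduce the matrix; the margin is the smallest absolute inner product. The following toy example (our own illustration, not taken from the paper; all names are ours) realizes a 2x2 sign matrix by an explicit 2-dimensional arrangement:

```python
import numpy as np

# Sign matrix of a toy concept class: M[x, c] = +1 iff instance x
# belongs to concept c, and -1 otherwise.
M = np.array([[+1, -1],
              [-1, +1]])

# A linear arrangement: unit vectors u_x (rows of U) for instances and
# v_c (rows of V) for concepts with sign(<u_x, v_c>) = M[x, c].
U = np.array([[1.0, 0.0],
              [0.0, 1.0]])
V = np.array([[ 1.0, -1.0],
              [-1.0,  1.0]]) / np.sqrt(2.0)

G = U @ V.T                           # matrix of inner products <u_x, v_c>
assert np.array_equal(np.sign(G), M)  # the arrangement realizes M

# The margin realized by this arrangement: min over all (x, c) pairs
# of |<u_x, v_c>|.  Here every inner product has magnitude 1/sqrt(2).
margin = np.min(np.abs(G))
print(round(margin, 4))               # 0.7071
```

The paper's questions then read off this picture: how small can the dimension of U and V be, and how large can the margin be made, for a given concept class M.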
This work has been supported in part by the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2, No. 27150, and by the Deutsche Forschungsgemeinschaft Grant SI 498/4-1.
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Forster, J., Simon, H.U. (2002). On the Smallest Possible Dimension and the Largest Possible Margin of Linear Arrangements Representing Given Concept Classes. In: Cesa-Bianchi, N., Numao, M., Reischuk, R. (eds) Algorithmic Learning Theory. ALT 2002. Lecture Notes in Computer Science, vol 2533. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36169-3_12
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-00170-6
Online ISBN: 978-3-540-36169-5