Abstract
This paper presents a computation of the V_γ dimension for regression in bounded subspaces of Reproducing Kernel Hilbert Spaces (RKHS) for the Support Vector Machine (SVM) regression ε-insensitive loss function L_ε and for general L_p loss functions. The V_γ dimension is shown to be finite, which in turn implies uniform convergence in probability for regression machines in RKHS subspaces that use the L_ε or general L_p loss functions; the proof of this result is novel. Under some conditions an upper bound on the V_γ dimension is also computed, leading to an approach for estimating the empirical V_γ dimension from a set of training data.
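For orientation, the loss functions named in the abstract are the standard ones from the SVM regression literature; the definitions below are the usual forms and are not reproduced from the paper's own notation:

\[
L_{\varepsilon}\bigl(y, f(x)\bigr) = \max\bigl(|y - f(x)| - \varepsilon,\; 0\bigr),
\qquad
L_{p}\bigl(y, f(x)\bigr) = |y - f(x)|^{p}.
\]

The V_γ dimension is a scale-sensitive capacity measure: a set of points {x_1, ..., x_n} is V_γ-shattered by a class of functions V (here, the loss functions induced by the bounded RKHS subspace) if there is a single level s ∈ ℝ such that every dichotomy is realized with margin γ around s,

\[
\forall\, E \subseteq \{x_1, \ldots, x_n\} \;\; \exists\, V :\quad
V(x_i) \ge s + \gamma \ \text{ for } x_i \in E,
\qquad
V(x_i) \le s - \gamma \ \text{ for } x_i \notin E,
\]

and the V_γ dimension is the cardinality of the largest such set (infinite if arbitrarily large sets can be shattered). Finiteness of this dimension for all γ > 0 is what yields the uniform convergence result claimed in the abstract.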
Copyright information
© 1999 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Evgeniou, T., Pontil, M. (1999). On the V_γ Dimension for Regression in Reproducing Kernel Hilbert Spaces. In: Watanabe, O., Yokomori, T. (eds) Algorithmic Learning Theory. ALT 1999. Lecture Notes in Computer Science, vol 1720. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46769-6_9
DOI: https://doi.org/10.1007/3-540-46769-6_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-66748-3
Online ISBN: 978-3-540-46769-4