Abstract
Learning function relations and understanding the structure of data lying on manifolds embedded in high-dimensional Euclidean spaces are important topics in learning theory. In this paper we study approximation and learning by Gaussians of functions defined on a d-dimensional connected compact C^∞ Riemannian submanifold of \(\mathbb{R}^n\) which is isometrically embedded. We show that convolution with the Gaussian kernel with variance σ provides a uniform approximation order of O(σ^s) when the approximated function is Lipschitz s for some s ∈ (0, 1]. The uniformly normal neighborhoods of a compact Riemannian manifold play a central role in deriving this approximation order. The approximation result is then used to investigate the regression learning algorithm generated by the multi-kernel least square regularization scheme associated with Gaussian kernels with flexible variances. When the regression function is Lipschitz s, our learning rate is \(\left((\log^2 m)/m\right)^{s/(8s+4d)}\), where m is the sample size. When the manifold dimension d is smaller than the dimension n of the underlying Euclidean space, this rate is much faster than those in the literature. By comparing approximation orders, we also show an essential difference between approximation schemes with flexible variances and those with a single variance.
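The multi-kernel least square regularization scheme described above can be illustrated with a minimal sketch. The code below is not the paper's algorithm; it assumes a standard kernel ridge form, solving (K + λmI)c = y for each candidate variance and selecting σ by minimizing the regularized empirical error over a small grid, which stands in for the flexible-variance scheme. The grid of variances, the toy manifold (a circle in \(\mathbb{R}^2\)), and the regularization parameter are all illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # K(x, y) = exp(-|x - y|^2 / sigma^2); the normalization convention is illustrative
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / sigma ** 2)

def least_squares_regularized(X, y, sigma, lam):
    # Kernel least squares with regularization: solve (K + lam*m*I) c = y,
    # giving the estimator f(x) = sum_i c_i K(x, x_i)
    m = len(y)
    K = gaussian_kernel(X, X, sigma)
    c = np.linalg.solve(K + lam * m * np.eye(m), y)
    return c, K

def multi_kernel_fit(X, y, sigmas, lam):
    # "Flexible variance" stand-in: fit with each sigma in a grid and keep the
    # one minimizing the regularized empirical error
    best = None
    for s in sigmas:
        c, K = least_squares_regularized(X, y, s, lam)
        pred = K @ c
        obj = np.mean((pred - y) ** 2) + lam * c @ K @ c
        if best is None or obj < best[0]:
            best = (obj, s, c)
    return best

# Toy data on a 1-dimensional manifold (unit circle) embedded in R^2
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 60)
X = np.stack([np.cos(t), np.sin(t)], axis=1)
y = np.sin(3 * t) + 0.05 * rng.standard_normal(60)

obj, sigma_star, c = multi_kernel_fit(X, y, [0.1, 0.3, 1.0, 3.0], 1e-3)
```

Because the samples lie on a d = 1 manifold inside n = 2 ambient dimensions, the paper's result suggests the achievable rate depends on d rather than n, which is the point of the faster rate claimed in the abstract.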
Communicated by Charles A. Micchelli.
Supported partially by the Research Grants Council of Hong Kong [Project No. CityU 103405], City University of Hong Kong [Project No. 7001983], National Science Fund for Distinguished Young Scholars of China [Project No. 10529101], and National Basic Research Program of China [Project No. 973-2006CB303102].
Ye, G.B., Zhou, D.X.: Learning and approximation by Gaussians on Riemannian manifolds. Adv. Comput. Math. 29, 291–310 (2008). https://doi.org/10.1007/s10444-007-9049-0
Keywords
- Learning theory
- Reproducing kernel Hilbert spaces
- Gaussian kernels
- Approximation
- Riemannian manifolds
- Multi-kernel least square regularization scheme