
Learning and approximation by Gaussians on Riemannian manifolds

Advances in Computational Mathematics

Abstract

Learning function relations or understanding the structure of data lying on manifolds embedded in high-dimensional Euclidean spaces is an important topic in learning theory. In this paper we study approximation and learning by Gaussians for functions defined on a d-dimensional connected compact \(C^\infty\) Riemannian submanifold of \({\rm I\!R}^n\) which is isometrically embedded. We show that convolution with the Gaussian kernel with variance σ provides a uniform approximation order of \(O(\sigma^s)\) when the approximated function is Lipschitz s with s ∈ (0, 1]. The uniform normal neighborhoods of a compact Riemannian manifold play a central role in deriving this approximation order. The approximation result is then used to analyze the regression learning algorithm generated by the multi-kernel least squares regularization scheme associated with Gaussian kernels with flexible variances. When the regression function is Lipschitz s, our learning rate is \(\left((\log^2 m)/m\right)^{s/(8s+4d)}\), where m is the sample size. When the manifold dimension d is smaller than the dimension n of the underlying Euclidean space, this rate is much faster than those in the literature. By comparing approximation orders, we also show the essential difference between approximation schemes with flexible variances and those with a single variance.
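As a purely illustrative sketch (not the authors' implementation), the Python snippet below mimics the flexible-variance idea behind the multi-kernel least squares regularization scheme: for each Gaussian variance on a grid it solves the kernel regularized least squares problem \(\min_{f}\frac{1}{m}\sum_{i=1}^m (f(x_i)-y_i)^2+\lambda\|f\|_{K_\sigma}^2\) via the representer theorem, and then keeps the variance with the smallest regularized empirical objective. The kernel convention \(\exp(-|x-y|^2/\sigma^2)\), the variance grid, the regularization parameter, and the toy data (a Lipschitz function on the unit circle, so d = 1 and n = 2) are all assumptions made for the example.

```python
import numpy as np

def gaussian_gram(X, sigma):
    """Gram matrix of the Gaussian kernel K_sigma(x, y) = exp(-|x - y|^2 / sigma^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-d2 / sigma ** 2)

def rls_fit(X, y, sigma, lam):
    """Regularized least squares in the RKHS of K_sigma:
    minimize (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    By the representer theorem f = sum_i c_i K_sigma(x_i, .), where
    (K + lam * m * I) c = y."""
    m = len(y)
    K = gaussian_gram(X, sigma)
    c = np.linalg.solve(K + lam * m * np.eye(m), y)
    obj = np.mean((K @ c - y) ** 2) + lam * c @ K @ c  # regularized empirical objective
    return c, obj

def multi_kernel_rls(X, y, sigmas, lam):
    """'Flexible variance' selection: also minimize the regularized objective over sigma."""
    best = None
    for s in sigmas:
        c, obj = rls_fit(X, y, s, lam)
        if best is None or obj < best[1]:
            best = (c, obj, s)
    c, _, sigma = best
    return c, sigma

def predict(Xtrain, c, sigma, Xtest):
    """Evaluate f(x) = sum_i c_i K_sigma(x_i, x) at the test points."""
    d2 = np.maximum(
        np.sum(Xtrain ** 2, 1)[:, None] + np.sum(Xtest ** 2, 1)[None, :]
        - 2.0 * Xtrain @ Xtest.T, 0.0)
    return np.exp(-d2 / sigma ** 2).T @ c

# Toy data: noisy samples of a Lipschitz function on the unit circle in R^2 (d = 1, n = 2).
rng = np.random.default_rng(0)
m = 200
theta = rng.uniform(0.0, 2.0 * np.pi, m)
X = np.column_stack([np.cos(theta), np.sin(theta)])        # sample points on the manifold
y = np.abs(np.sin(theta)) + 0.05 * rng.standard_normal(m)  # Lipschitz target plus noise

c, sigma = multi_kernel_rls(X, y, sigmas=np.geomspace(0.05, 2.0, 20), lam=1e-3)
print("selected Gaussian variance parameter:", sigma)
```

In the paper itself the analysis is carried out in the reproducing kernel Hilbert spaces of the Gaussian kernels, with the variance treated as a free parameter of the scheme rather than being restricted to a finite grid as in this sketch.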



Author information

Correspondence to Ding-Xuan Zhou.

Additional information

Communicated by Charles A. Micchelli.

Supported partially by the Research Grants Council of Hong Kong [Project No. CityU 103405], City University of Hong Kong [Project No. 7001983], National Science Fund for Distinguished Young Scholars of China [Project No. 10529101], and National Basic Research Program of China [Project No. 973-2006CB303102].


About this article

Cite this article

Ye, GB., Zhou, DX. Learning and approximation by Gaussians on Riemannian manifolds. Adv Comput Math 29, 291–310 (2008). https://doi.org/10.1007/s10444-007-9049-0

