On the Eigenspectrum of the Gram Matrix and Its Relationship to the Operator Eigenspectrum

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2533)

Abstract

In this paper we analyze the relationship between the eigenvalues of the m × m Gram matrix K for a kernel k(·, ·), corresponding to a sample x1, . . . , xm drawn from a density p(x), and the eigenvalues of the corresponding continuous eigenproblem. We bound the differences between the two spectra and provide a performance bound on kernel PCA.
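The abstract's central object can be illustrated numerically. The following sketch (not from the paper; it assumes an RBF kernel, a standard normal density p(x), and an illustrative helper `rbf_gram`) computes the scaled Gram-matrix eigenvalues λ_i(K)/m for two sample sizes. As m grows, these scaled eigenvalues stabilize, which is the empirical counterpart of their convergence to the eigenvalues of the continuous operator that the paper bounds.

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    """Gram matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
top = {}
for m in (100, 400):
    X = rng.normal(size=(m, 1))           # sample x_1, ..., x_m from p(x) = N(0, 1)
    K = rbf_gram(X)
    evals = np.linalg.eigvalsh(K)[::-1]   # eigenvalues of K, descending
    top[m] = evals[:5] / m                # lambda_i(K) / m: estimates of the operator spectrum
    print(m, np.round(top[m], 3))
```

The two printed rows of scaled eigenvalues should be close to each other despite the fourfold change in sample size, consistent with the concentration of spectral properties that the paper analyzes.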




Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Shawe-Taylor, J., Williams, C., Cristianini, N., Kandola, J. (2002). On the Eigenspectrum of the Gram Matrix and Its Relationship to the Operator Eigenspectrum. In: Cesa-Bianchi, N., Numao, M., Reischuk, R. (eds) Algorithmic Learning Theory. ALT 2002. Lecture Notes in Computer Science, vol 2533. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36169-3_4

  • DOI: https://doi.org/10.1007/3-540-36169-3_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-00170-6

  • Online ISBN: 978-3-540-36169-5

  • eBook Packages: Springer Book Archive
