Abstract
In this work we apply kernel subspace techniques to feature extraction. The projections of the data onto the coordinates of the high-dimensional feature space induced by the kernel function are called features. The basis vectors onto which the data are projected depend on the eigendecomposition of the kernel matrix, which may become very high-dimensional for large training sets; however, only the largest eigenvalues and their corresponding eigenvectors are needed to extract relevant features. We present low-rank approximations of the kernel matrix based on the Nyström method, and use numerical simulations to demonstrate the Nyström extension applied to feature extraction and classification. The performance of the presented methods is evaluated on the USPS data set.
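The core idea described in the abstract can be sketched as follows: instead of eigendecomposing the full n × n kernel matrix, the Nyström method eigendecomposes only a small m × m kernel matrix over a subset of landmark points and extends those eigenvectors to all data, yielding a rank-limited feature map. This is a minimal illustrative sketch, not the authors' implementation; the RBF kernel, the function names, and all parameter values are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel between rows of X and rows of Y (an assumed choice)
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def nystroem_features(X, landmarks, n_components, gamma=0.5):
    # C: kernel evaluations between all n points and the m landmarks (n x m)
    C = rbf_kernel(X, landmarks, gamma)
    # W: kernel matrix among the landmarks only (m x m) -- the small problem
    W = rbf_kernel(landmarks, landmarks, gamma)
    # Eigendecompose the small matrix instead of the full n x n kernel matrix
    vals, vecs = np.linalg.eigh(W)
    # Keep only the largest eigenvalues and corresponding eigenvectors
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Nystroem extension: project all data onto the extended eigenvectors,
    # so K is approximated by Phi @ Phi.T = C @ W_k^+ @ C.T
    return C @ vecs / np.sqrt(np.maximum(vals, 1e-12))

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 16))
landmarks = X[rng.choice(200, 20, replace=False)]
Phi = nystroem_features(X, landmarks, n_components=10)

# The low-rank reconstruction of the full kernel matrix
K = rbf_kernel(X, X)
rel_err = np.linalg.norm(K - Phi @ Phi.T) / np.linalg.norm(K)
```

The rows of `Phi` are the extracted features that would feed a downstream classifier; `rel_err` measures how well the rank-10 Nyström approximation reconstructs the full kernel matrix.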
© 2008 Springer-Verlag Berlin Heidelberg
Teixeira, A.R., Tomé, A.M., Lang, E.W. (2008). Feature Extraction Using Low-Rank Approximations of the Kernel Matrix. In: Campilho, A., Kamel, M. (eds) Image Analysis and Recognition. ICIAR 2008. Lecture Notes in Computer Science, vol 5112. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69812-8_40
Print ISBN: 978-3-540-69811-1
Online ISBN: 978-3-540-69812-8