Abstract
Principal component analysis (PCA) is widely used in signal processing, pattern recognition, and related fields. PCA has been extended to relative PCA (RPCA), which provides the principal components of a signal while suppressing the effects of other signals. PCA has also been extended to kernel PCA (KPCA): by using a mapping from the original space to a higher-dimensional space, represented implicitly through its kernel, PCA can be performed in the higher-dimensional space. In this paper, we propose kernel RPCA (KRPCA) and give its solution. As in KPCA, the order of the matrices that must be computed for the solution equals the number of samples; this is the 'kernel trick'. We provide experimental results of an application to pattern recognition to show the advantages of KRPCA over KPCA.
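For readers unfamiliar with the kernel trick mentioned in the abstract, the following minimal sketch illustrates it with standard KPCA, the method that KRPCA builds on; the paper's KRPCA solution itself is not reproduced here. The function name kernel_pca, the choice of an RBF kernel, and the gamma parameter are illustrative assumptions, not part of the paper. The key point is that every matrix involved is n x n, where n is the number of samples, independent of the feature-space dimension.

import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    # Illustrative sketch of standard kernel PCA with an RBF kernel.
    n = X.shape[0]
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2): n x n,
    # regardless of the (possibly infinite) feature-space dimension.
    sq = np.sum(X**2, axis=1)
    sq_dists = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    K = np.exp(-gamma * sq_dists)
    # Center the data in feature space via the Gram matrix.
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the n x n centered Gram matrix
    # (np.linalg.eigh returns eigenvalues in ascending order).
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Normalize expansion coefficients so each feature-space
    # eigenvector has unit norm.
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    # Projections of the training samples onto the principal axes.
    return Kc @ alphas

# Example usage: project 100 random 2-D points onto 2 components.
X = np.random.RandomState(0).randn(100, 2)
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (100, 2)

Note that the eigendecomposition operates only on the centered n x n Gram matrix; the feature space induced by the RBF kernel is infinite-dimensional, yet it is never materialized.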
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Washizawa, Y., Hikida, K., Tanaka, T., Yamashita, Y. (2004). Kernel Relative Principal Component Analysis for Pattern Recognition. In: Fred, A., Caelli, T.M., Duin, R.P.W., Campilho, A.C., de Ridder, D. (eds) Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2004. Lecture Notes in Computer Science, vol 3138. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-27868-9_122
DOI: https://doi.org/10.1007/978-3-540-27868-9_122
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22570-6
Online ISBN: 978-3-540-27868-9