Abstract
This paper presents a novel dimension reduction algorithm for kernel-based classification. In the feature space, the proposed algorithm maximizes the ratio of the squared between-class distance to the sum of the within-class variances of the training samples for a given reduced dimension. This algorithm has lower complexity than the recently reported kernel dimension reduction (KDR) for supervised learning. We conducted several simulations with large training datasets, which demonstrate that the proposed algorithm performs similarly to, or marginally better than, KDR whilst having the advantage of computational efficiency. Further, we applied the proposed dimension reduction algorithm to face recognition, in which the number of training samples is very small. This proposed face recognition approach based on the new algorithm outperforms the eigenface approach based on principal component analysis (PCA) when the training data is complete, that is, representative of the whole dataset.
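As an illustration of the kind of criterion described above (not the authors' algorithm), the ratio of the squared between-class distance to the summed within-class variances can be evaluated in a kernel feature space using only kernel evaluations, since both quantities reduce to averages of kernel matrix entries. The sketch below assumes a two-class problem and an RBF kernel; the function names and the choice of kernel are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def class_separation_ratio(X, y, gamma=1.0):
    """Ratio of squared between-class mean distance to the sum of
    within-class variances, computed in the RBF feature space via
    the kernel trick (illustrative two-class sketch)."""
    classes = np.unique(y)
    assert len(classes) == 2, "sketch handles two classes only"
    Xa, Xb = X[y == classes[0]], X[y == classes[1]]
    Kaa = rbf_kernel(Xa, Xa, gamma)
    Kbb = rbf_kernel(Xb, Xb, gamma)
    Kab = rbf_kernel(Xa, Xb, gamma)
    # Squared distance between the class means in feature space:
    # ||m_a - m_b||^2 = mean(Kaa) - 2*mean(Kab) + mean(Kbb).
    between = Kaa.mean() - 2.0 * Kab.mean() + Kbb.mean()
    # Within-class variance: mean ||phi(x) - m||^2 = mean(diag K) - mean(K).
    within_a = np.diag(Kaa).mean() - Kaa.mean()
    within_b = np.diag(Kbb).mean() - Kbb.mean()
    return between / (within_a + within_b)
```

Well-separated classes yield a large ratio (class means far apart relative to class spread), while overlapping classes yield a ratio near zero; a dimension-reducing projection would be chosen to keep this ratio large in the reduced space.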
References
Blake, C. L. and Merz, C. J. (1998). UCI Repository of Machine Learning Databases [http://www.ics.uci.edu/~mlearn/MLRepository.html]. University of California, Irvine, CA, Dept. of Information and Computer Science.
Boser, B., Guyon, I. and Vapnik, V. (1992). A training algorithm for optimal margin classifiers. Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pp. 144–152.
Fukumizu, K., Bach, F. R. and Jordan, M. I. (2004). Dimensionality reduction for supervised learning with reproducing kernel Hilbert spaces. Journal of Machine Learning Research 5, 73–99.
Manton, J. H. (2002). Optimization algorithms exploiting unitary constraints. IEEE Transactions on Signal Processing 50, 635–650.
McLachlan, G. J. (1992). Discriminant Analysis and Statistical Pattern Recognition. John Wiley & Sons, New York.
Mika, S., Rätsch, G., Weston, J., Schölkopf, B. and Müller, K. (1999). Fisher discriminant analysis with kernels. Neural Networks for Signal Processing IX, IEEE, pp. 41–48.
Pan, V. (1984). How can we speed up matrix multiplication? SIAM Review 26, 393–415.
Suykens, J. A. K., Van Gestel, T., De Brabanter, J., De Moor, B. and Vandewalle, J. (2002). Least Squares Support Vector Machines. World Scientific, Singapore.
Suykens, J. A. K. and Vandewalle, J. (1999). Least squares support vector machine classifiers. Neural Processing Letters 9, 293–300.
Tjahyadi, R. (2004). Investigations into PCA and DCT Based Recognition Algorithms. Master's Thesis, Curtin University of Technology.
Turk, M. and Pentland, A. (1991). Eigenfaces for recognition. Journal of Cognitive Neuroscience 3, 71–86.
Van Gestel, T., Suykens, J. A. K., Baesens, B., Viaene, S., Vanthienen, J., Dedene, G., De Moor, B. and Vandewalle, J. (2004). Benchmarking least squares support vector machine classifiers. Machine Learning 54, 5–32.
Vapnik, V. (1995). The Nature of Statistical Learning Theory. Springer-Verlag, New York.
Yale University Face Database (2004). [online] http://cvc.yale.edu/projects/yalefaces/yalefaces.html.
Weston, J., Mukherjee, S., Chapelle, O., Pontil, M., Poggio, T. and Vapnik, V. (2001). Feature selection for SVMs. Advances in Neural Information Processing Systems 13, 668–674.
Xiong, H., Swamy, M. N. S. and Ahmad, M. O. (2005). Optimizing the kernel in the empirical feature space. IEEE Transactions on Neural Networks 16, 460–474.
Cite this article
An, S., Liu, W., Venkatesh, S. et al. A Fast Feature-based Dimension Reduction Algorithm for Kernel Classifiers. Neural Process Lett 24, 137–151 (2006). https://doi.org/10.1007/s11063-006-9016-7