Abstract
In traditional canonical correlation analysis (CCA)-based face recognition methods, the number of samples is usually smaller than the dimension of the samples, which is known as the small sample size (SSS) problem. To solve this problem, a new supervised learning method called two-dimensional CCA (2DCCA) is developed in this paper. Unlike traditional CCA, 2DCCA extracts features directly from image matrices rather than first transforming each matrix into a vector. In practice, the covariance matrices constructed by 2DCCA are almost always full rank, so the SSS problem can be effectively handled by the proposed method. The theoretical foundation of 2DCCA is first developed, and the construction of the class-membership matrix Y, which precisely represents the relationship between samples and classes in the 2DCCA framework, is then clarified. In addition, the analytic form of the generalized inverse of this class-membership matrix is derived. Our face recognition experiments clearly show that the SSS problem is effectively solved and that the proposed method achieves better recognition performance than several other CCA-based methods.
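The rank argument behind the SSS claim can be illustrated numerically. The sketch below (with hypothetical image sizes; this is not the authors' 2DCCA implementation) compares the covariance matrix of vectorized images, whose rank is bounded by the number of samples, with a covariance matrix accumulated directly from image matrices, which is full rank in practice:

```python
import numpy as np

rng = np.random.default_rng(0)
N, h, w = 10, 32, 32                  # 10 images of size 32 x 32 (illustrative)
images = rng.standard_normal((N, h, w))

# Vector-based approach: vectorize each image, then form the covariance matrix.
X = images.reshape(N, h * w)
X = X - X.mean(axis=0)
cov_1d = X.T @ X / N                  # (h*w) x (h*w) = 1024 x 1024
rank_1d = np.linalg.matrix_rank(cov_1d)   # at most N - 1 = 9: rank deficient

# Matrix-based approach: accumulate covariance directly from image matrices.
mean_img = images.mean(axis=0)
cov_2d = sum((A - mean_img).T @ (A - mean_img) for A in images) / N  # w x w
rank_2d = np.linalg.matrix_rank(cov_2d)   # w = 32: full rank, since N*h >> w

print(rank_1d, cov_1d.shape)
print(rank_2d, cov_2d.shape)
```

The vectorized covariance is 1024 × 1024 but has rank at most N − 1, while the matrix-based covariance is only w × w and generically attains full rank, which is exactly why the 2D formulation avoids the SSS problem.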
References
Hotelling H (1936) Relations between two sets of variates. Biometrika 28:321–377
Borga M (2001) Canonical correlation: a tutorial. http://www.imt.liu.se/~magnus/cca/
Borga M (1998) Learning multidimensional signal processing. PhD thesis, Linköping University, Sweden
Yu S (2001) Direct blind channel equalization via the programmable canonical correlation analysis. Signal Processing 81:1715–1724
Becker S (1996) Mutual information maximization: models of cortical self-organization. Network Comput Neural Syst 7:7–31
Melzer T, Reiter M, Bischof H (2003) Appearance models based on kernel canonical correlation analysis. Pattern Recognit 36:1961–1971
Sun Q, Zeng S, Liu Y, Heng P, Xia D (2005) A new method of feature fusion and its application in image recognition. Pattern Recognit 38:2437–2448
Zheng W, Zhou X, Zou C, Zhao L (2006) Facial expression recognition using kernel canonical correlation analysis (KCCA). IEEE Trans Neural Networks 17(1):233–238
Fukunaga K (1990) Introduction to statistical pattern recognition. Academic Press, New York
Raudys SJ, Jain AK (1991) Small sample size effects in statistical pattern recognition: recommendations for practitioners. IEEE Trans Pattern Anal Mach Intell 13(3):252–264
Friedman J (1989) Regularized discriminant analysis. J Am Stat Assoc 84(405):165–175
Hong ZQ, Yang JY (1991) Optimal discriminant plane for a small number of samples and design method of classifier on the plane. Pattern Recognit 24(4):317–324
Chen LF, Liao HYM, Ko MT (2000) A new LDA based face recognition system which can solve the small sample size problem. Pattern Recognit 33:1713–1726
Yang J, Yang JY (2003) Why can LDA be performed in PCA transformed space? Pattern Recognit 36(2):563–566
Yang J, Zhang D, Frangi AF, Yang JY (2004) Two-dimensional PCA: a new approach to appearance-based face representation and recognition. IEEE Trans Pattern Anal Mach Intell 26(1):131–137
Li M, Yuan BZ (2005) 2D-LDA: a statistical linear discriminant analysis for image matrix. Pattern Recognit Lett 26:527–532
Barker M (2000) Partial least squares for discrimination. PhD Dissertation, University of Kentucky
Graybill FA (1976) Theory and application of the linear model. Wadsworth and Brooks/Cole, Pacific Grove, pp 31–33
Appendix: the generalized inverse of class-membership covariance matrix
From (5), we can obtain:
where \( \left( {I_{l \times N} - \frac{1}{N}R_{N} R_{N}^{\text{T}} } \right) \) is the centering projector, \( I_{l \times N} \) is the (l × N) × (l × N) identity matrix, and \( R_{N}^{\text{T}} \) is an h × (l × N) matrix composed of N copies of the matrix Q
and
where \( S_{i} = n_{i} QQ^{\text{T}} \). Because M is a diagonal matrix, its generalized inverse \( M^{G} \) is also diagonal, and each non-zero element on its diagonal equals the reciprocal of the corresponding diagonal element of M. Following [17, 18], the generalized inverse of \( \Sigma_{ZZ} \) is \( \Sigma_{ZZ}^{G} = \frac{1}{n_{C}} R_{C - 1} R_{C - 1}^{\text{T}} + M^{G} \).
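The reciprocal rule for the generalized inverse of a diagonal matrix can be checked directly. In this small NumPy sketch the diagonal values are illustrative, not taken from the paper; the constructed inverse is compared against NumPy's Moore-Penrose pseudoinverse:

```python
import numpy as np

# Generalized inverse of a diagonal matrix: take reciprocals of the non-zero
# diagonal entries and leave the zero entries at zero.
d = np.array([3.0, 0.0, 5.0, 0.0])        # illustrative diagonal, rank deficient
M = np.diag(d)

safe_d = np.where(d != 0, d, 1.0)          # avoid division by zero
d_inv = np.where(d != 0, 1.0 / safe_d, 0.0)
M_G = np.diag(d_inv)

# M_G coincides with the Moore-Penrose pseudoinverse and satisfies M M^G M = M.
print(np.allclose(M_G, np.linalg.pinv(M)))
print(np.allclose(M @ M_G @ M, M))
```

This is why the appendix can write down \( M^{G} \) in closed form: no numerical pseudoinversion of the possibly singular M is needed.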
From (19), the matrix Σ YY can be written as
where
So the generalized inverse of the class-membership covariance matrix is as follows
Sun, N., Ji, Zh., Zou, Cr. et al. Two-dimensional canonical correlation analysis and its application in small sample size face recognition. Neural Comput & Applic 19, 377–382 (2010). https://doi.org/10.1007/s00521-009-0291-x