
Two-dimensional canonical correlation analysis and its application in small sample size face recognition

  • Original Article
  • Neural Computing and Applications

Abstract

In traditional canonical correlation analysis (CCA)-based face recognition methods, the number of samples is usually smaller than the dimensionality of each sample. This is the so-called small sample size (SSS) problem. To solve it, a new supervised learning method called two-dimensional CCA (2DCCA) is developed in this paper. Unlike the traditional CCA method, 2DCCA extracts features directly from image matrices rather than from vectors produced by a matrix-to-vector transformation. In practice, the covariance matrix extracted by 2DCCA is always full rank, so the SSS problem can be effectively dealt with by this newly developed method. The theoretical foundation of the 2DCCA method is first developed, and the construction of the class-membership matrix Y, which precisely represents the relationship between samples and classes in the 2DCCA framework, is then clarified. In addition, the analytic form of the generalized inverse of this class-membership matrix is derived. Our experimental results on face recognition show not only that the SSS problem is effectively solved, but also that 2DCCA achieves better recognition performance than several other CCA-based methods.
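To make the dimensionality argument concrete, the following minimal numpy sketch (our illustration, not the authors' implementation; the sample count and image size are hypothetical) contrasts the singular covariance produced by vectorizing images with the small, typically full-rank covariance accumulated directly over image matrices, as in 2D methods:

```python
import numpy as np

rng = np.random.default_rng(0)
N, h, w = 40, 28, 23                       # 40 images of size 28 x 23 (hypothetical)
images = rng.standard_normal((N, h, w))

# Vectorized pipeline: the covariance is (h*w) x (h*w) and its rank is
# at most N - 1, so it is singular whenever N is small.
X = images.reshape(N, h * w)
cov_vec = np.cov(X, rowvar=False)          # 644 x 644
print(np.linalg.matrix_rank(cov_vec))      # 39 << 644: singular

# Matrix-based pipeline: keep each image as a matrix and accumulate an
# h x h covariance over the centered image matrices.
mean_img = images.mean(axis=0)
cov_2d = sum((A - mean_img) @ (A - mean_img).T for A in images) / N
print(np.linalg.matrix_rank(cov_2d))       # 28: typically full rank
```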


References

  1. Hotelling H (1936) Relations between two sets of variates. Biometrika 28:321–377

  2. Borga M (2001) Canonical correlation: a tutorial. http://www.imt.liu.se/~magnus/cca/

  3. Borga M (1998) Learning multidimensional signal processing. PhD thesis, Linköping University, Sweden

  4. Yu S (2001) Direct blind channel equalization via the programmable canonical correlation analysis. Signal Process 81:1715–1724

  5. Becker S (1996) Mutual information maximization: models of cortical self-organization. Network: Comput Neural Syst 7:7–31

  6. Melzer T, Reiter M, Bischof H (2003) Appearance models based on kernel canonical correlation analysis. Pattern Recognit 36:1961–1971

  7. Sun Q, Zeng S, Liu Y, Heng P, Xia D (2005) A new method of feature fusion and its application in image recognition. Pattern Recognit 38:2437–2448

  8. Zheng W, Zhou X, Zou C, Zhao L (2006) Facial expression recognition using kernel canonical correlation analysis (KCCA). IEEE Trans Neural Networks 17(1):233–238

  9. Fukunaga K (1990) Introduction to statistical pattern recognition. Academic Press, New York

  10. Raudys SJ, Jain AK (1991) Small sample size effects in statistical pattern recognition: recommendations for practitioners. IEEE Trans Pattern Anal Mach Intell 13(3):252–264

  11. Friedman J (1989) Regularized discriminant analysis. J Am Stat Assoc 84(405):165–175

  12. Hong ZQ, Yang JY (1991) Optimal discriminant plane for a small number of samples and design method of classifier on the plane. Pattern Recognit 24(4):317–324

  13. Chen LF, Liao HYM, Ko MT (2000) A new LDA based face recognition system which can solve the small sample size problem. Pattern Recognit 33:1713–1726

  14. Yang J, Yang JY (2003) Why can LDA be performed in PCA transformed space? Pattern Recognit 36(2):563–566

  15. Yang J, Zhang D, Frangi AF, Yang JY (2004) Two-dimensional PCA: a new approach to appearance-based face representation and recognition. IEEE Trans Pattern Anal Mach Intell 26(1):131–137

  16. Li M, Yuan BZ (2005) 2D-LDA: a statistical linear discriminant analysis for image matrix. Pattern Recognit Lett 26:527–532

  17. Barker M (2000) Partial least squares for discrimination. PhD Dissertation, University of Kentucky

  18. Graybill FA (1976) Theory and application of the linear model. Wadsworth and Brooks/Cole, Pacific Grove, pp 31–33

Author information

Correspondence to Ning Sun.

Appendix: the generalized inverse of the class-membership covariance matrix

From (5), we can obtain:

$$ \begin{aligned} \Sigma_{YY} & = \sum\limits_{i = 1}^{C} {\sum\limits_{j = 1}^{{n_{i} }} {(y_{ij} - \overline{Y} )} } (y_{ij} - \overline{Y} )^{\text{T}} \\ & = Y\left(I_{l \times N} - \frac{1}{N}R_{N} R_{N}^{\text{T}} \right)Y^{\text{T}} = YY^{\text{T}} - \frac{1}{N}YR_{N} R_{N}^{\text{T}} Y^{\text{T}} = U - \frac{1}{N}UR_{C} R_{C}^{\text{T}} U^{\text{T}} \\ \end{aligned} $$
(15)
$$ \begin{aligned} \Sigma_{ZZ} & = \sum\limits_{i = 1}^{C} {\sum\limits_{j = 1}^{{n_{i} }} {(z_{ij} - \overline{Z} )} } (z_{ij} - \overline{Z} )^{\text{T}} \\ & = Z\left(I_{l \times N} - \frac{1}{N}R_{N} R_{N}^{\text{T}} \right)Z^{\text{T}} = ZZ^{\text{T}} - \frac{1}{N}ZR_{N} R_{N}^{\text{T}} Z^{\text{T}} = M - \frac{1}{N}MR_{C - 1} R_{C - 1}^{\text{T}} M \\ \end{aligned} $$
(16)

where \( \left( {I_{l \times N} - \frac{1}{N}R_{N} R_{N}^{\text{T}} } \right) \) is the centering projector, \( I_{l \times N} \) is the \( (l \times N) \times (l \times N) \) identity matrix, and \( R_{N}^{\text{T}} \) is an \( h \times (l \times N) \) matrix composed of N copies of the matrix Q:

$$ \left( {I_{l \times N} - \frac{1}{N}R_{N} R_{N}^{\text{T}} } \right) = \left( {\left[ {\begin{array}{*{20}c} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \\ \end{array} } \right] - \frac{1}{N}\left[ {\begin{array}{*{20}c} {Q^{\text{T}} } \\ {Q^{\text{T}} } \\ \vdots \\ {Q^{\text{T}} } \\ \end{array} } \right]\left[ {\begin{array}{*{20}c} Q & Q & \cdots & Q \\ \end{array} } \right]} \right) $$
(17)

and

$$ M = \left[ {\begin{array}{*{20}c} {S_{1} } & 0 & \cdots & 0 \\ 0 & {S_{2} } & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & {S_{C - 1} } \\ \end{array} } \right]_{{\{ h \times (C - 1)\} \times \{ h \times (C - 1)\} }} $$
(18)
$$ U = \left[ {\begin{array}{*{20}c} M & 0 \\ 0 & {S_{C} } \\ \end{array} } \right]_{(h \times C) \times (h \times C)} $$
(19)

where \( S_{i} = n_{i} QQ^{\text{T}} \). Because M is a diagonal matrix, its generalized inverse \( M^{G} \) is also diagonal, and each non-zero diagonal element of \( M^{G} \) equals the reciprocal of the corresponding diagonal element of M. Following [17, 18], the generalized inverse of \( \Sigma_{ZZ} \) is \( \Sigma_{ZZ}^{G} = \frac{1}{{n_{C} }}R_{C - 1} R_{C - 1}^{\text{T}} + M^{G} \).
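The reciprocal rule for \( M^{G} \) is easy to verify numerically. The sketch below (our illustration, using a hypothetical singular diagonal matrix) checks the defining property \( MM^{G}M = M \):

```python
import numpy as np

d = np.array([3.0, 0.0, 5.0, 2.0])    # hypothetical diagonal entries, one zero
M = np.diag(d)

# Generalized inverse: reciprocal of the non-zero diagonal entries,
# zero where M itself is zero.
inv_d = np.zeros_like(d)
inv_d[d != 0] = 1.0 / d[d != 0]
M_G = np.diag(inv_d)

assert np.allclose(M @ M_G @ M, M)          # defining property of a generalized inverse
print(np.allclose(M_G, np.linalg.pinv(M)))  # True: coincides with Moore-Penrose here
```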

Substituting (19) into (15), the matrix \( \Sigma_{YY} \) can be written as

$$ \Sigma_{YY} = \left[ {\begin{array}{*{20}c} {\Sigma_{ZZ} } & V \\ {V^{\text{T}} } & {\frac{{N - n_{C} }}{N}S_{C} } \\ \end{array} } \right]_{(h \times C) \times (h \times C)} $$
(20)

where

$$ V = \frac{{ - n_{C} }}{N}MR_{C - 1} $$
(21)

The generalized inverse of the class-membership covariance matrix then follows as

$$ \Sigma_{YY}^{G} = \left[ {\begin{array}{*{20}c} {\Sigma_{ZZ}^{G} } & {0_{{\{ h \times (C - 1)\} \times h}} } \\ {0_{{h \times \{ h \times (C - 1)\} }} } & {0_{h \times h} } \\ \end{array} } \right]_{(h \times C) \times (h \times C)} $$
(22)
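As a quick numerical sanity check of (20)–(22), the following sketch (our illustration, not from the paper) instantiates the simplest case h = 1 and Q = 1, in which Y reduces to a one-hot class-indicator matrix, \( M^{G} = \operatorname{diag}(1/n_{i}) \), and \( R_{C-1} \) is the all-ones vector, and verifies that the zero-padded block matrix in (22) is indeed a generalized inverse of \( \Sigma_{YY} \):

```python
import numpy as np

n = np.array([4, 3, 5, 2])                 # hypothetical class sizes n_i
C, N = len(n), n.sum()
labels = np.repeat(np.arange(C), n)
Y = np.eye(C)[labels].T                    # C x N class-membership matrix (one-hot)

H = np.eye(N) - np.ones((N, N)) / N        # centering projector
S_YY = Y @ H @ Y.T                         # rank C - 1, hence singular
S_ZZ = S_YY[:-1, :-1]                      # leading block, cf. eq. (20)

# Sigma_ZZ^G = (1/n_C) R R^T + M^G, with R the all-ones vector here
S_ZZ_G = np.ones((C - 1, C - 1)) / n[-1] + np.diag(1.0 / n[:-1])

# Eq. (22): pad with zeros to obtain a generalized inverse of Sigma_YY
S_YY_G = np.zeros((C, C))
S_YY_G[:-1, :-1] = S_ZZ_G

print(np.allclose(S_ZZ @ S_ZZ_G @ S_ZZ, S_ZZ))   # True
print(np.allclose(S_YY @ S_YY_G @ S_YY, S_YY))   # True
```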

About this article

Cite this article

Sun, N., Ji, Zh., Zou, Cr. et al. Two-dimensional canonical correlation analysis and its application in small sample size face recognition. Neural Comput & Applic 19, 377–382 (2010). https://doi.org/10.1007/s00521-009-0291-x
