Abstract.
Constrained principal component analysis (CPCA) incorporates external information into principal component analysis (PCA) of a data matrix. CPCA first decomposes the data matrix according to the external information (external analysis), and then applies PCA to the decomposed matrices (internal analysis). The external analysis amounts to projections of the data matrix onto the spaces spanned by matrices of external information, while the internal analysis involves the generalized singular value decomposition (GSVD). Since its original proposal, CPCA has evolved both conceptually and methodologically; it is now founded on firmer mathematical ground, allows a greater variety of decompositions, and includes a wider range of interesting special cases. In this paper we present a comprehensive theory and various extensions of CPCA that were not fully envisioned in the original paper. The new developments we discuss include least squares (LS) estimation under possibly singular metric matrices, two useful theorems concerning GSVD, decompositions of data matrices into finer components, and fitting higher-order structures. We also discuss four special cases of CPCA: 1) CCA (canonical correspondence analysis) and CALC (canonical analysis with linear constraints), 2) GMANOVA (generalized MANOVA), 3) Lagrange's theorem, and 4) CANO (canonical correlation analysis) and related methods. We conclude with brief remarks on the advantages and disadvantages of CPCA relative to competing methods.
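The two-stage procedure summarized above can be sketched numerically. The following is a minimal illustration, not the paper's full method: it assumes row-side external information only, identity metric matrices (so the GSVD reduces to the ordinary SVD), and hypothetical toy data; all variable names are the author's own.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: Z is an n x p data matrix, G holds
# external information on the rows (n x k).
n, p, k = 20, 6, 2
Z = rng.standard_normal((n, p))
G = rng.standard_normal((n, k))

# External analysis: decompose Z by projecting it onto the column
# space of G. P_G = G (G'G)^+ G'; the pseudoinverse accommodates a
# possibly rank-deficient G.
P_G = G @ np.linalg.pinv(G.T @ G) @ G.T
Z_G = P_G @ Z          # part of Z explained by G
Z_res = Z - Z_G        # residual part; Z = Z_G + Z_res exactly

# Internal analysis: PCA of each decomposed part. With identity
# metric matrices this is the ordinary SVD of Z_G.
U, s, Vt = np.linalg.svd(Z_G, full_matrices=False)

# Sanity checks: P_G is an orthogonal projector (idempotent and
# symmetric), and the decomposition is exact.
assert np.allclose(P_G @ P_G, P_G)
assert np.allclose(P_G, P_G.T)
assert np.allclose(Z, Z_G + Z_res)
```

With nonidentity (possibly singular) metric matrices on the rows and columns, the projector and the SVD above would be replaced by their metric-weighted counterparts, which is where the paper's GSVD theorems come in.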
Received: June 23, 2000; revised version: July 9, 2001
Cite this article
Takane, Y., Hunter, M. Constrained Principal Component Analysis: A Comprehensive Theory. AAECC 12, 391–419 (2001). https://doi.org/10.1007/s002000100081