Singular value decomposition in additive, multiplicative, and logistic forms
Introduction
Singular value decomposition (SVD) was introduced by Eckart and Young [1] and has become one of the most widely used techniques of computational algebra and multivariate statistical analysis, applied to data approximation, reduction, and visualization. The SVD, also known as the matrix spectral decomposition, is closely related to principal components and to the Moore–Penrose generalized matrix inverse. SVD represents a rectangular matrix as a low-rank additive combination of outer products of dual left and right eigenvectors [2], [3], [4], [5]. Sequential sums of these outer products yield a matrix approximation of any needed precision, defined by the cumulative share of the eigenvalues in the squared Euclidean norm of the matrix. SVD is applied to various problems in pattern recognition [6], [7], [8], [9], [10], [11], [12], multidimensional scaling and cluster analysis [13], [14], [15], [16], [17], and perceptual mapping [18], [19]. It is the main tool in correspondence analysis, or dual scaling, for categorical data [20], [21], [22], [23], [24], [25], [26], [27]. Numerous SVD applications are known in practical data visualization [28], [29], [30], [31], [32], [33], and in priority evaluations [34], [35], [36], [37], [38], [39].
Although the SVD is extremely useful in various applications, it can produce inadequate results for some data. In scene recognition and reconstruction problems, the data matrices of pixel values are positive. Perceptual maps are often constructed from counts, proportions, or positive share values. Correspondence analysis uses the second and third pairs of dual vectors for data plotting, so a matrix approximation of the third rank is implicitly used. In all these problems, if we reconstruct the original data by the first several items of the matrix spectral decomposition, we can easily obtain an approximated matrix with irrelevant negative values (for instance, of pixel data). In the case of proportion data, the decomposition by singular vectors can yield a lower-rank approximation with reconstructed elements beyond the needed interval (outside the 0–100 range for percent data).
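This effect is easy to reproduce on a small contrived positive matrix (the data below is an illustration constructed for this purpose, not taken from the paper): truncating its SVD to rank 2 already yields a negative reconstructed entry.

```python
import numpy as np

# Contrived positive matrix (illustration only, not the paper's data).
# Its exact singular values are 90, 81, and 76.5.
X = np.array([[80.0,  4.0,  1.0],
              [ 4.0, 83.0,  5.0],
              [ 1.0,  5.0, 84.5]])

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Rank-2 truncation: keep the two leading singular triplets.
X2 = (U[:, :2] * s[:2]) @ Vt[:2]

# Although every entry of X is positive, the rank-2 reconstruction
# contains a negative entry (X2[0, 2] is approximately -16).
print(X2.min())
```

Note that a rank-1 truncation of a nonnegative matrix is always nonnegative (its leading singular vectors can be taken nonnegative), so the problem appears from rank 2 onward.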
In this work we suggest a convenient modification of SVD that produces a lower-rank approximation of data with the desired properties. To obtain an always-positive matrix approximation of any rank, we apply the SVD to the logarithms of the elements of the original data matrix. This approach corresponds to minimizing the multiplicative relative deviations of the vectors' outer product from the original data, and it yields a multiplicative decomposition of the matrix into a product of exponents powered with the singular values and dual vectors. In another approach, applying SVD to logistically transformed proportion data and minimizing the deviations, we obtain a lower-rank approximation with all matrix elements positive and less than one. This technique minimizes the multiplicative relative deviations from the odds of the empirical proportions. We also consider an SVD with additive components that corresponds to centering the data matrix in both directions, and an SVD for data arranged as in regression analysis, where besides the independent variables there is a dependent variable. The transformation of a positive matrix to logarithms, or of a proportion matrix to the logarithms of the odds, is performed element by element. The transformation from the singular value decomposition of the transformed data back to the matrix approximation of the original data is likewise performed elementwise. This procedure engages a straightforward transformation of the values of each element, not the matrix as a whole, so no computational problems (such as those in taking the exponent of a matrix) occur.
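A minimal sketch of the two transformed decompositions, assuming illustrative data invented here: the paper's estimation is framed as minimizing relative deviations, while this sketch simply applies the ordinary SVD to the elementwise-transformed matrix and maps the truncation back. By construction the multiplicative form returns only positive entries, and the logistic form returns only entries strictly between zero and one.

```python
import numpy as np

def lowrank(M, r):
    """Ordinary SVD truncated to the first r singular triplets."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# Illustrative positive data (e.g., percent shares; not the paper's data).
X = np.array([[90.0,  5.0,  5.0],
              [ 5.0, 80.0, 15.0],
              [10.0, 30.0, 60.0]])

# Multiplicative form: SVD of log(X), mapped back elementwise with exp.
X_mult = np.exp(lowrank(np.log(X), 1))
print((X_mult > 0).all())   # positivity is guaranteed

# Logistic form for proportions P in (0, 1): SVD of the elementwise
# log-odds (logit), mapped back with the logistic function.
P = X / 100.0
logit = np.log(P / (1.0 - P))
P_fit = 1.0 / (1.0 + np.exp(-lowrank(logit, 1)))
print(((P_fit > 0) & (P_fit < 1)).all())   # entries stay inside (0, 1)
```

Because the exponent of any real number is positive, and the logistic function maps any real number into (0, 1), these range constraints hold at every rank of truncation, in contrast to the ordinary SVD.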
This paper is organized as follows. Section 2 describes the regular SVD technique and suggests its additive, multiplicative, and logistic extensions. Section 3 considers numerical examples, and Section 4 summarizes.
Section snippets
Matrix decomposition in additive, multiplicative, and logistic forms
Let us briefly describe the regular SVD, or matrix approximation by a cumulative sum of the outer products of eigenvectors; see, for instance, [1], [2], [3], [4], [5]. Let X denote a data matrix of order N x n, with elements x_ij of the ith observation (i = 1, ..., N) by the jth variable (j = 1, ..., n). A matrix approximation by r outer products of the vectors is

x_ij ≈ s_1 a_i1 b_j1 + s_2 a_i2 b_j2 + ... + s_r a_ir b_jr,

where a_ik and b_jk are elements of the kth pair of vectors a_k and b_k (of Nth and nth order, respectively), and s_k are the singular values.
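In numpy terms, accumulating the weighted outer products term by term reproduces the matrix exactly once all min(N, n) terms are included (a sketch with random data):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((5, 3))                      # N = 5 observations, n = 3 variables
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# x_ij ≈ sum over k of s_k * a_ik * b_jk, accumulated one triplet at a time.
approx = np.zeros_like(X)
for k in range(len(s)):
    approx += s[k] * np.outer(U[:, k], Vt[k])

print(np.allclose(approx, X))   # full set of terms recovers X exactly
```

Stopping the loop at k < len(s) gives the rank-(k+1) approximation, whose accuracy is governed by the cumulative share of the squared singular values.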
Numerical examples
The data for numerical examples are taken from a marketing research project on a pharmaceutical product evaluated by 280 medical practitioners. The product's brands are denoted as Z, ZS, YD, YS, Y, X, and XD (where X, Y, and Z are actual brands, S denotes a syrup version, and D denotes an additional ingredient). The attributes are: a—quick relief, b—safe to use, c—safe to use with other diseases, d—listed on most formularies, e—few side effects, f—one dose for all day, g—long lasting relief,
Summary
We considered the singular value decomposition technique adjusted to obtain matrices with specific features at any step of approximation. It is shown that a positive matrix can be more adequately approximated in the multiplicative SVD by the product of powered spectral decomposition items obtained from the logarithms of the original matrix elements. In such an approximation we are guaranteed to preserve the positive entries of the original matrix in the entries of any approximating matrix.
Acknowledgements
The authors wish to thank a referee whose valuable comments and suggestions improved and clarified the paper.
References (45)
- et al., Do singular values contain adequate information for face recognition, Pattern Recognition (2003)
- et al., Singular value decomposition in AHP, Eur. J. Oper. Res. (2004)
- et al., Robust estimation of priorities in the AHP, Eur. J. Oper. Res. (2002)
- et al., Linear methods in multimode data analysis for decision making, Comput. Oper. Res. (1994)
- et al., The approximation of one matrix by another of lower rank, Psychometrika (1936)
- et al., Matrix Computations (1983)
- Multivariate Observations (1984)
- Elements of Statistical Computing: Numerical Computation (1988)
- A singular value decomposition: the SVD of a matrix, Coll. Math. J. (1996)
- et al., Outer product expansions and their uses in digital image processing, Am. Math. Mon. (1975)