Comments on “An analytical algorithm for generalized low-rank approximations of matrices”
Introduction
Low-rank approximation is a useful technique for dimension reduction and is widely used in image compression, information retrieval, pattern recognition, and so on. However, when applied to high-dimensional data, this technique runs into practical computational limits. Ye [1] proposed a novel approach to reduce the computational cost, based on a new data representation model. An iterative algorithm for these generalized low rank approximations of matrices (GLRAM) was also given, and it was shown to outperform the traditional singular value decomposition (SVD) based methods.
Liang and Shi recently proposed a new algorithm for GLRAM in Ref. [2]. They concluded that their scheme is an analytical algorithm and, unlike Ye's scheme, requires no iterative steps. Unfortunately, as we will show shortly, this scheme is not analytical. Although Liang and Shi claimed that their scheme is a closed-form solution for GLRAM, our analysis indicates that it cannot obtain the right solutions, because the theorem from which the algorithm is derived is incorrect. Furthermore, upper and lower bounds on the objective function of GLRAM are also given in this paper. For simplicity, we call their scheme the LS scheme hereafter. The incorrectness of the LS scheme is also pointed out in Refs. [3], [4], [5]; in particular, Ref. [3] proposes a non-iterative algorithm that finds an approximate solution of GLRAM.
The rest of this paper is organized as follows. We give a brief review of the LS scheme in Section 2. Our analysis is presented in Section 3. The conclusion is in Section 4.
Review of the LS scheme
Denote the M original image matrices as $A_i \in \mathbb{R}^{r \times c}$, for $i = 1, \ldots, M$. Let two matrices $L \in \mathbb{R}^{r \times l_1}$ and $R \in \mathbb{R}^{c \times l_2}$ be with orthonormal columns. The aim is to find proper $L$, $R$, and M matrices $M_i \in \mathbb{R}^{l_1 \times l_2}$, such that $L M_i R^{\mathrm{T}}$ is a good approximation of $A_i$, for $i = 1, \ldots, M$. This can be formulated as follows [1]:
$$\min_{L,\,R,\,M_i} \; \sum_{i=1}^{M} \bigl\| A_i - L M_i R^{\mathrm{T}} \bigr\|_F^2 .$$
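The GLRAM objective can be sketched numerically. The snippet below is a minimal illustration, not code from Refs. [1], [2]; the dimensions, the random data, and the helper name `glram_objective` are all hypothetical, with images $A_i$, projections $L$, $R$ with orthonormal columns, and cores $M_i$ as in the formulation above.

```python
import numpy as np

# Hypothetical dimensions: M = 5 images of size 8 x 6, reduced to 3 x 2 cores.
rng = np.random.default_rng(0)
M, r, c, l1, l2 = 5, 8, 6, 3, 2
A = [rng.standard_normal((r, c)) for _ in range(M)]

def glram_objective(L, R, Ms):
    """Sum of squared Frobenius errors: sum_i ||A_i - L M_i R^T||_F^2."""
    return sum(np.linalg.norm(Ai - L @ Mi @ R.T, 'fro') ** 2
               for Ai, Mi in zip(A, Ms))

# Any matrices with orthonormal columns are feasible; here obtained via QR.
L, _ = np.linalg.qr(rng.standard_normal((r, l1)))
R, _ = np.linalg.qr(rng.standard_normal((c, l2)))
Ms = [L.T @ Ai @ R for Ai in A]   # cores induced by this choice of L and R
print(glram_objective(L, R, Ms))
```

Because $L M_i R^{\mathrm{T}}$ is an orthogonal projection of $A_i$ when $M_i = L^{\mathrm{T}} A_i R$, the objective is always between zero and $\sum_i \|A_i\|_F^2$.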
To solve the minimization problem, some theorems and corollaries are needed, which can be found in Refs. [1], [2].

Theorem 1

Let $L$, $R$, and $\{M_i\}_{i=1}^{M}$ be the optimal solution to the minimization problem above. Then $M_i = L^{\mathrm{T}} A_i R$ for $i = 1, \ldots, M$.
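Theorem 1's claim that, for fixed orthonormal $L$ and $R$, the best core is $M_i = L^{\mathrm{T}} A_i R$ can be spot-checked numerically. This is a sketch on arbitrary random data (not the matrices from Refs. [1], [2]); it relies on the identity $\|A - L M R^{\mathrm{T}}\|_F^2 = \|A - L M^* R^{\mathrm{T}}\|_F^2 + \|M - M^*\|_F^2$, which holds when $L$ and $R$ have orthonormal columns.

```python
import numpy as np

rng = np.random.default_rng(1)
r, c, l1, l2 = 8, 6, 3, 2
A = rng.standard_normal((r, c))
L, _ = np.linalg.qr(rng.standard_normal((r, l1)))  # orthonormal columns
R, _ = np.linalg.qr(rng.standard_normal((c, l2)))

M_star = L.T @ A @ R  # the core claimed optimal by Theorem 1
err_star = np.linalg.norm(A - L @ M_star @ R.T, 'fro')

# No random perturbation of the core should achieve a smaller error.
for _ in range(200):
    M_pert = M_star + 0.1 * rng.standard_normal((l1, l2))
    assert np.linalg.norm(A - L @ M_pert @ R.T, 'fro') >= err_star - 1e-12
```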
Comments on the LS scheme
In Ref. [2], Liang and Shi claimed that their scheme is an analytical algorithm for GLRAM. However, we will show in Theorem 4 that this is not true. Theorem 4 The LS scheme is not an analytical algorithm for GLRAM. Proof To prove that the scheme is not an analytical algorithm, we need only find a counterexample for which its result is not the optimal solution of Eq. (2). Define two matrices as Let , and then according to the LS scheme, we can get
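The gap that Theorem 4 exposes can also be observed numerically: Ye's iterative algorithm [1] never increases the GLRAM objective across its alternating updates, so a transformation pair produced without iteration can in general be improved further. The sketch below runs the alternation from [1] on hypothetical random data (the counterexample matrices of the proof are not reproduced here); the initialization and update rules follow the algorithm in [1], while all dimensions and data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
M, r, c, l1, l2 = 10, 8, 6, 3, 2
A = [rng.standard_normal((r, c)) for _ in range(M)]

def objective(L, R):
    """GLRAM objective with the optimal cores M_i = L^T A_i R plugged in."""
    return sum(np.linalg.norm(Ai - L @ (L.T @ Ai @ R) @ R.T, 'fro') ** 2
               for Ai in A)

def top_eigvecs(S, k):
    """Eigenvectors of the symmetric matrix S for its k largest eigenvalues."""
    w, V = np.linalg.eigh(S)           # eigenvalues in ascending order
    return V[:, ::-1][:, :k]

# Alternating updates as in Ye's GLRAM [1], starting from L = (I, 0)^T.
L = np.eye(r, l1)
objs = []
for _ in range(20):
    R = top_eigvecs(sum(Ai.T @ L @ L.T @ Ai for Ai in A), l2)
    L = top_eigvecs(sum(Ai @ R @ R.T @ Ai.T for Ai in A), l1)
    objs.append(objective(L, R))

# The objective is non-increasing across iterations.
assert all(b <= a + 1e-9 for a, b in zip(objs, objs[1:]))
```

Any candidate closed-form pair $(L, R)$ can be checked the same way: feed it to one round of these updates, and if the objective drops, the pair was not optimal.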
Conclusion
In this paper, we present an analysis of the analytical scheme for GLRAM proposed recently in Ref. [2]. Our results show that this scheme is incorrect, contrary to the original authors' conclusion. Furthermore, upper and lower bounds on the objective function of GLRAM are also given. We remark that constructing effective and fast algorithms for low-rank approximation and its applications remains an open problem.
Acknowledgments
The authors would like to thank the anonymous reviewers and the associate editor for their insightful comments, which led to a significantly improved presentation of the manuscript. This work was supported by the National Natural Science Foundation of China under Grant 60675002, and funded by the Basic Research Foundation of Tsinghua National Laboratory for Information Science and Technology (TNList).
References (6)

- Z. Liang, P. Shi, An analytical algorithm for generalized low-rank approximations of matrices, Pattern Recognition (2005).
- et al., Non-iterative generalized low rank approximation of matrices, Pattern Recognition Lett. (2006).
- J. Ye, Generalized low rank approximations of matrices, The 21st International Conference on Machine Learning, 2004, ...