Abstract
Minimizing the rank of a matrix subject to constraints is a challenging problem that arises in many applications in machine learning, control theory, and discrete geometry. This class of optimization problems, known as rank minimization, is NP-hard, and for most practical problems there are no efficient algorithms that yield exact solutions. A popular heuristic replaces the rank function with the nuclear norm (the sum of the singular values) of the decision variable, and has been shown to produce the optimal low-rank solution in a variety of scenarios. In this paper, we assess the practical performance of this heuristic for finding the minimum-rank matrix subject to linear equality constraints. We characterize properties of the null space of the linear operator defining the constraint set that are necessary and sufficient for the heuristic to succeed. We then analyze linear constraints sampled uniformly at random, and obtain dimension-free bounds under which our null space properties hold almost surely as the matrix dimensions tend to infinity. Finally, we provide empirical evidence that these probabilistic bounds give accurate predictions of the heuristic's performance in non-asymptotic scenarios.
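As a concrete illustration of the surrogate the abstract describes (this sketch is not from the paper itself), the nuclear norm of a matrix is the sum of its singular values; it plays the same role for rank that the ℓ1 norm plays for sparsity. For a rank-one matrix X = uvᵀ, the nuclear norm equals ‖u‖₂‖v‖₂, which a few lines of NumPy can verify:

```python
import numpy as np

def nuclear_norm(X):
    # Sum of singular values: the convex surrogate for rank(X).
    return np.linalg.svd(X, compute_uv=False).sum()

# A rank-one matrix X = u v^T has a single nonzero singular value,
# equal to ||u||_2 * ||v||_2.
u = np.array([1.0, 2.0])
v = np.array([3.0, 0.0, 4.0])
X = np.outer(u, v)

print(np.linalg.matrix_rank(X))  # rank of the outer product: 1
print(nuclear_norm(X))           # equals ||u||_2 * ||v||_2 = 5*sqrt(5)
```

The heuristic studied in the paper minimizes this quantity over the affine set {X : A(X) = b}, a convex problem solvable by semidefinite programming, in place of the NP-hard rank objective.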
Recht, B., Xu, W. & Hassibi, B. Null space conditions and thresholds for rank minimization. Math. Program. 127, 175–202 (2011). https://doi.org/10.1007/s10107-010-0422-2