Abstract
The low-rank matrix completion problem has attracted great attention and has been widely studied in collaborative filtering and recommender systems. Because rank minimization is NP-hard, the problem is usually relaxed to matrix nuclear-norm minimization. However, the scalability of this approach is limited by the high computational cost of singular value decomposition (SVD). In this paper we introduce random projection to address this limitation: in particular, we use a randomized SVD to accelerate the classical Soft-Impute algorithm for matrix completion. Empirical results show that our approach is more efficient while achieving almost the same accuracy.
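The idea described in the abstract can be sketched as follows: each Soft-Impute iteration fills the missing entries with the current estimate and soft-thresholds the singular values, and the expensive SVD inside that loop is replaced by a randomized SVD based on random projection. This is a minimal illustrative sketch, not the authors' implementation; the function names, the oversampling and power-iteration parameters, and the fixed target rank are assumptions for the example.

```python
import numpy as np

def randomized_svd(A, rank, n_oversamples=10, n_iter=2, seed=0):
    """Approximate rank-truncated SVD via random projection
    (in the spirit of Halko, Martinsson & Tropp, 2011)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + n_oversamples, min(m, n))
    # Sample the range of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, k))
    Y = A @ Omega
    # Power iterations sharpen the approximation when the
    # singular values decay slowly.
    for _ in range(n_iter):
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)
    # Exact SVD of the small projected matrix B = Q^T A.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank, :]

def soft_impute_step(X, mask, Z, lam, rank):
    """One Soft-Impute iteration: fill the unobserved entries of X
    with the current estimate Z, then soft-threshold the singular
    values by lam. `mask` is True on observed entries."""
    filled = np.where(mask, X, Z)
    U, s, Vt = randomized_svd(filled, rank)
    s_thresh = np.maximum(s - lam, 0.0)
    return U @ np.diag(s_thresh) @ Vt
```

The cost per iteration drops from a full SVD to a few matrix products with a thin random sketch, which is the source of the speedup the abstract claims.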
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Cao, X. (2015). A Scalable and Feasible Matrix Completion Approach Using Random Projection. In: Arik, S., Huang, T., Lai, W., Liu, Q. (eds) Neural Information Processing. ICONIP 2015. Lecture Notes in Computer Science(), vol 9491. Springer, Cham. https://doi.org/10.1007/978-3-319-26555-1_62
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-26554-4
Online ISBN: 978-3-319-26555-1