A Scalable and Feasible Matrix Completion Approach Using Random Projection

  • Conference paper
Neural Information Processing (ICONIP 2015)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9491)


Abstract

The low-rank matrix completion problem has attracted great attention and has been widely studied in collaborative filtering and recommendation systems. Since rank minimization is NP-hard, the problem is usually relaxed into matrix nuclear norm minimization. However, the scalability of this relaxation is limited by the high computational complexity of the singular value decomposition (SVD). In this paper we introduce random projection to handle this limitation. In particular, we use a randomized SVD to accelerate the classical Soft-Impute algorithm for the matrix completion problem. Empirical results show that our approach is more efficient while achieving almost the same performance.
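The idea described in the abstract can be sketched in NumPy: Soft-Impute repeatedly fills in the missing entries with the current low-rank estimate and soft-thresholds the singular values, and a randomized SVD (in the style of Halko, Martinsson, and Tropp) replaces the exact SVD in that inner step. This is a minimal illustration, not the authors' implementation; the function names and parameters (`randomized_svd`, `soft_impute`, `n_oversample`, `n_iter`, `lam`) are assumptions for the sketch, and the fill-in matrix is treated as dense.

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, n_iter=2, seed=0):
    """Approximate truncated SVD via random projection (Halko et al. style)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + n_oversample, min(m, n))
    # Sketch the column space of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, k))
    Y = A @ Omega
    # A few power iterations sharpen the estimate when the spectrum decays slowly.
    for _ in range(n_iter):
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)
    # Exact SVD of the small projected matrix B = Q^T A is cheap (k x n).
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank, :]

def soft_impute(X, mask, lam, rank, n_steps=100):
    """Soft-Impute: alternate between filling missing entries with the current
    estimate and soft-thresholding the singular values of the filled matrix."""
    Z = np.zeros_like(X)
    for _ in range(n_steps):
        # Observed entries come from X; unobserved entries from the estimate Z.
        filled = mask * X + (1 - mask) * Z
        U, s, Vt = randomized_svd(filled, rank)
        s_thr = np.maximum(s - lam, 0.0)  # soft-thresholding of singular values
        Z = (U * s_thr) @ Vt
    return Z
```

The cost per iteration drops from a full SVD of the filled matrix to a sketch-and-project step whose small SVD is only O(k^2 n), which is where the claimed speedup over classical Soft-Impute comes from.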



Author information

Correspondence to Xiang Cao.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Cao, X. (2015). A Scalable and Feasible Matrix Completion Approach Using Random Projection. In: Arik, S., Huang, T., Lai, W., Liu, Q. (eds) Neural Information Processing. ICONIP 2015. Lecture Notes in Computer Science, vol 9491. Springer, Cham. https://doi.org/10.1007/978-3-319-26555-1_62

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-26554-4

  • Online ISBN: 978-3-319-26555-1

  • eBook Packages: Computer Science (R0)
