Abstract
Stochastic gradient descent (SGD) is an effective algorithm for solving the matrix factorization problem. However, the performance of SGD depends critically on how learning rates are tuned over time. In this paper, we propose a novel per-dimension learning rate schedule called AALRSMF. This schedule relies on local gradients, requires no manual tuning of a global learning rate, and is shown to be robust to the selection of hyper-parameters. Extensive experiments demonstrate that the proposed schedule shows promising results compared to existing ones on matrix factorization.
Acknowledgement
The numerical calculations in this paper were performed on the supercomputing system at the Supercomputing Center of the University of Science and Technology of China.
Copyright information
© 2016 Springer International Publishing Switzerland
About this paper
Cite this paper
Wei, F., Guo, H., Cheng, S., Jiang, F. (2016). AALRSMF: An Adaptive Learning Rate Schedule for Matrix Factorization. In: Li, F., Shim, K., Zheng, K., Liu, G. (eds) Web Technologies and Applications. APWeb 2016. Lecture Notes in Computer Science, vol. 9932. Springer, Cham. https://doi.org/10.1007/978-3-319-45817-5_36
DOI: https://doi.org/10.1007/978-3-319-45817-5_36
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-45816-8
Online ISBN: 978-3-319-45817-5