Distributed Bayesian Probabilistic Matrix Factorization

https://doi.org/10.1016/j.procs.2017.05.009
Under a Creative Commons license (open access)

Abstract

Matrix factorization is a widely used machine learning technique, particularly in recommender systems. Despite its high prediction accuracy and its ability to avoid over-fitting, the Bayesian Probabilistic Matrix Factorization (BPMF) algorithm has not been widely applied to large-scale data because of its prohibitive computational cost. In this paper, we propose a distributed, high-performance parallel implementation of BPMF using Gibbs sampling on shared- and distributed-memory architectures. We show that, by combining efficient load balancing based on work stealing on a single node with asynchronous communication in the distributed version, we outperform state-of-the-art implementations.
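To make the core idea concrete, the following is a minimal, hypothetical sketch of one Gibbs sweep in BPMF: each user's latent factor vector is drawn from its conditional Gaussian posterior given the observed ratings and the current item factors. This is not the paper's implementation; hyperparameters (`alpha`, `lam`) are fixed here, whereas full BPMF would also resample the item factors and the Normal-Wishart hyperparameters.

```python
import numpy as np

def gibbs_sample_user_factors(R, mask, U, V, alpha=2.0, lam=2.0, rng=None):
    """One Gibbs sweep over user factors U given item factors V.

    R    : (n_users, n_items) rating matrix
    mask : boolean array marking observed entries of R
    alpha: observation-noise precision (assumed fixed here)
    lam  : prior precision on the latent factors (assumed fixed here)

    A full BPMF sampler would symmetrically resample V and the
    hyperparameters; this sketch shows only the user-side update.
    """
    rng = rng or np.random.default_rng(0)
    k = U.shape[1]
    for i in range(R.shape[0]):
        obs = mask[i]                                  # items rated by user i
        Vo = V[obs]                                    # (n_obs, k) item factors
        prec = lam * np.eye(k) + alpha * Vo.T @ Vo     # posterior precision
        cov = np.linalg.inv(prec)
        mean = alpha * cov @ Vo.T @ R[i, obs]          # posterior mean
        U[i] = rng.multivariate_normal(mean, cov)      # draw from conditional
    return U
```

Because each user's conditional draw depends only on the (shared, read-only) item factors, the loop over users is embarrassingly parallel within a sweep, which is what makes work stealing and distributed execution attractive for this algorithm.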

Keywords

Probabilistic matrix factorization
Collaborative filtering
Machine learning
PGAS
Multi-core
