Abstract:
Predicting which items a user will consume next (i.e., next-item recommendation) is a crucial task for recommender systems. While factorization methods are a popular choice in recommendation, several recent efforts have shown that the inner product does not satisfy the triangle inequality, which may hurt a model's generalization ability. TransRec is a promising method that overcomes this issue by learning a distance metric to predict the strength of user-item interactions. Nevertheless, it uses only the latest consumed item to model a user's short-term preference, which limits modeling fidelity. In this article, we propose a simple yet effective method, named the attentive translation model, to explicitly exploit high-order sequential information for next-item recommendation. Specifically, we construct a user-specific translation vector by accounting for multiple recent items, which encode more information about a user's short-term preference than the latest item alone. To aggregate multiple items into one representation, we devise a position-aware attention mechanism that learns different weights for items at different orders in a personalized way. Extensive experiments on four real-world datasets show that our method significantly outperforms several state-of-the-art methods.
Published in: IEEE Transactions on Industrial Informatics ( Volume: 16, Issue: 3, March 2020)
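The idea sketched in the abstract (a position-aware attention over a user's recent items, producing a translation vector that is scored against candidates by distance) can be illustrated as follows. This is a minimal sketch under stated assumptions, not the authors' implementation: the function names, the additive position embeddings, the single attention query vector `w`, and the Euclidean scoring are all illustrative choices.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of attention scores.
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_translation(recent_items, pos_emb, w):
    """Aggregate the L most recent item embeddings into one translation
    vector with position-aware attention.

    recent_items: (L, d) embeddings of the L most recent items
    pos_emb:      (L, d) position embeddings (inject order information)
    w:            (d,)   attention query vector
    All names/shapes are hypothetical, for illustration only.
    """
    keys = recent_items + pos_emb      # make attention sensitive to position
    scores = keys @ w                  # one scalar score per recent item
    alpha = softmax(scores)            # personalized weights, sum to 1
    return alpha @ recent_items        # weighted sum -> translation vector

def score(user_emb, translation, item_emb):
    # Metric-based scoring in the spirit of TransRec: a candidate item is
    # ranked higher the closer it is to the "translated" user point,
    # following the triangle-inequality motivation in the abstract.
    return -np.linalg.norm(user_emb + translation - item_emb)

# Tiny usage example with random embeddings.
rng = np.random.default_rng(0)
L, d = 3, 8
items = rng.normal(size=(L, d))
positions = rng.normal(size=(L, d))
query = rng.normal(size=d)
t = attentive_translation(items, positions, query)
print(t.shape)  # (8,)
```

Replacing TransRec's single latest item with this weighted aggregate is exactly the "high-order sequential information" the abstract refers to: the attention weights let the model decide, per user, how much each recent position should contribute.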