Abstract:
With more consumers opting to purchase outfits online, a challenge facing e-commerce providers is to effectively model the compatibility of the outfits being recommended to the user. Existing models mainly consider the propagation of information within the visual and textual modalities of the outfit, and overlook the intrinsic compatibility and connections among items within the outfit and across multiple category views. To address these issues, we propose a Heterogeneous-Grained Multi-Modal Graph Network (HMGN) model to capture the style-compatible relationships among items across various category views. The HMGN is constructed with two graph neural networks built from coarse-grained and fine-grained category views of the outfit. Additionally, we develop a mixed category granularity fusion method for the HMGN to learn interactions among items in both single-grained and cross-grained views during propagation. Thus, the HMGN achieves collaborative information propagation across different category-grained views. Comprehensive experiments show that the HMGN approach outperforms state-of-the-art methods on both the Polyvore Outfits-ND and the Polyvore Outfits-D datasets.
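To illustrate the idea of propagating item features over two category-granularity graphs and then fusing them, the following is a minimal NumPy sketch. It is not the authors' implementation: the adjacency matrices, the mean-aggregation layer, and the fixed fusion weight `alpha` are all illustrative assumptions (HMGN learns these interactions with trainable graph neural networks), and the toy features are random.

```python
import numpy as np

def propagate(adj, feats):
    # One parameter-free message-passing step: mean-aggregate
    # each item's neighbor features under the given adjacency.
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    return (adj @ feats) / deg

# Toy outfit of 4 items with 8-dim multi-modal embeddings (random stand-ins).
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))

# Coarse-grained category view (hypothetical edges, e.g. top/bottom/shoe level).
adj_coarse = np.array([[0, 1, 1, 0],
                       [1, 0, 0, 1],
                       [1, 0, 0, 1],
                       [0, 1, 1, 0]], dtype=float)

# Fine-grained category view (hypothetical edges, e.g. t-shirt/jeans/sneaker level).
adj_fine = np.array([[0, 1, 0, 0],
                     [1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [0, 0, 1, 0]], dtype=float)

# Single-grained propagation within each category view.
h_coarse = propagate(adj_coarse, feats)
h_fine = propagate(adj_fine, feats)

# Cross-grained fusion; alpha is an assumed fixed weight here,
# standing in for the learned mixed-granularity fusion.
alpha = 0.5
h_fused = alpha * h_coarse + (1 - alpha) * h_fine

# Score outfit compatibility as mean pairwise cosine similarity of fused items.
normed = h_fused / np.linalg.norm(h_fused, axis=1, keepdims=True)
sim = normed @ normed.T
n = len(sim)
score = (sim.sum() - np.trace(sim)) / (sim.size - n)
print(h_fused.shape, float(score))
```

The sketch keeps the two views' propagation separate and only mixes them at fusion time; the paper's contribution is to let single-grained and cross-grained interactions inform each other during propagation rather than only afterward.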
Published in: IEEE Transactions on Emerging Topics in Computational Intelligence ( Volume: 8, Issue: 2, April 2024)