
Hierarchical Multimodal Graph Learning for Outfit Compatibility Modelling



Abstract:

Outfit compatibility modelling plays a significant role in e-commerce decision-making, but existing methods are restricted to modelling visual and textual information and neglect both the direct contribution of category labels and the differences in semantic richness among modalities. This paper addresses these issues with a hierarchical multimodal graph learning framework for outfit compatibility modelling, called HMGL-OCM, which consists of an item-level graph network and a modality-level graph network. The former augments the local information of the modal features within each item, and the latter globally learns the interactions between different items within each modality. Further, a novel cross-modality propagation method is developed at the modality-level stage: it leverages features from the other modalities' networks for complementary modelling while preserving the information density of an item's current modality. Extensive experiments on three real-world fashion datasets corroborate that HMGL-OCM surpasses state-of-the-art methods.
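
The abstract only outlines the two-level design, so the following minimal PyTorch sketch illustrates one plausible wiring of it: an item-level graph over the three modality nodes (visual, textual, category) of each item, followed by a per-modality graph over items with a gated cross-modality propagation term. All module names, dimensions, the mean-aggregation message passing, and the sigmoid gate are assumptions made for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class ItemLevelGraph(nn.Module):
    """Fully connected graph over the modality nodes of one item; one round
    of mean-aggregation message passing augments each modal feature with
    local context from the item's other modalities."""
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim)
        self.upd = nn.Linear(2 * dim, dim)

    def forward(self, x):  # x: (n_items, n_modalities, dim)
        m = self.msg(x)
        # mean over the other modality nodes of the same item
        agg = (m.sum(dim=1, keepdim=True) - m) / (x.size(1) - 1)
        return torch.relu(self.upd(torch.cat([x, agg], dim=-1)))

class ModalityLevelGraph(nn.Module):
    """Per-modality graph over all items, plus a gated cross-modality
    propagation term that injects features aggregated from the other
    modalities while retaining the current modality's information."""
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim)
        self.gate = nn.Linear(2 * dim, dim)
        self.upd = nn.Linear(2 * dim, dim)

    def forward(self, x):  # x: (n_items, n_modalities, dim)
        # within-modality: each item aggregates the other items' features
        m = self.msg(x)
        intra = (m.sum(dim=0, keepdim=True) - m) / (x.size(0) - 1)
        # cross-modality: same item's features from the other modalities,
        # gated so the current modality dominates the update
        cross = (x.sum(dim=1, keepdim=True) - x) / (x.size(1) - 1)
        g = torch.sigmoid(self.gate(torch.cat([x, cross], dim=-1)))
        h = intra + g * cross
        return torch.relu(self.upd(torch.cat([x, h], dim=-1)))

class HMGLSketch(nn.Module):
    """Hypothetical end-to-end scorer: item-level graph, then modality-level
    graph, then pooling to a single outfit compatibility score in (0, 1)."""
    def __init__(self, dim=64):
        super().__init__()
        self.item_graph = ItemLevelGraph(dim)
        self.modality_graph = ModalityLevelGraph(dim)
        self.score = nn.Linear(dim, 1)

    def forward(self, feats):  # feats: (n_items, 3, dim)
        h = self.item_graph(feats)
        h = self.modality_graph(h)
        return torch.sigmoid(self.score(h.mean(dim=(0, 1))))

# usage: an outfit of 4 items, each with visual/textual/category embeddings
outfit = torch.randn(4, 3, 64)
print(HMGLSketch(64)(outfit))

The gate here is one simple way to realise "retaining the current modality information density of the item": cross-modality features only enter the update to the extent the learned gate admits them.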
Page(s): 4130 - 4142
Date of Publication: 17 April 2024
Electronic ISSN: 2471-285X

