
Shared-Private Memory Networks For Multimodal Sentiment Analysis


Abstract:

Textual, visual, and acoustic modalities are usually complementary in the Multimodal Sentiment Analysis (MSA) task. However, current methods focus primarily on shared representations while neglecting the critical private aspects of the data within individual modalities. In this work, we propose shared-private memory networks (SPMN), built on recent advances in the attention mechanism, to decouple multimodal representations into shared and private perspectives. SPMN contains three components: a) a shared memory that learns the shared representations of multimodal data; b) three private memories that learn the private representations of the individual modalities; and c) adaptive fusion gates that fuse the multimodal private and shared representations. To evaluate the effectiveness of SPMN, we integrate it into different pre-trained language representation models, such as BERT and XLNet, and conduct experiments on two public datasets, CMU-MOSI and CMU-MOSEI. Experimental results indicate that SPMN significantly improves the performance of pre-trained language representation models and demonstrate the superiority of our model over state-of-the-art methods. SPMN's source code is publicly available at: https://github.com/xiaobaicaihhh/SPMN.
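The adaptive fusion gate described in component (c) can be read as a learned convex combination of a shared and a private representation. Below is a minimal numpy sketch of one plausible formulation; the exact gating function used in the paper is not given in the abstract, so the weight matrix `W`, bias `b`, and the sigmoid-gated form are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fusion_gate(h_shared, h_private, W, b):
    """Hypothetical adaptive fusion gate.

    Computes g = sigmoid(W [h_shared; h_private] + b), then fuses the two
    representations as an element-wise convex combination:
        fused = g * h_shared + (1 - g) * h_private
    """
    z = np.concatenate([h_shared, h_private], axis=-1)  # [h_shared; h_private]
    g = sigmoid(z @ W + b)                              # gate in (0, 1)
    return g * h_shared + (1.0 - g) * h_private

# Toy usage with random vectors (dimension 4 is arbitrary).
rng = np.random.default_rng(0)
d = 4
h_s = rng.normal(size=(d,))          # shared representation
h_p = rng.normal(size=(d,))          # private representation (one modality)
W = rng.normal(size=(2 * d, d))      # gate weights (illustrative)
b = np.zeros(d)                      # gate bias
fused = fusion_gate(h_s, h_p, W, b)
```

Because the gate lies in (0, 1) element-wise, each fused coordinate stays between the corresponding shared and private values, so the gate decides, per dimension, how much shared versus private information to keep.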
Published in: IEEE Transactions on Affective Computing ( Volume: 14, Issue: 4, 01 Oct.-Dec. 2023)
Page(s): 2889 - 2900
Date of Publication: 14 November 2022

