
MemBridge: Video-Language Pre-Training With Memory-Augmented Inter-Modality Bridge


Abstract:

Video-language pre-training has attracted considerable attention recently for its promising performance on various downstream tasks. Most existing methods utilize modality-specific or modality-joint representation architectures for cross-modality pre-training. Different from previous methods, this paper presents a novel architecture named Memory-augmented Inter-Modality Bridge (MemBridge), which uses learnable intermediate modality representations as the bridge for the interaction between videos and language. Specifically, in the transformer-based cross-modality encoder, we introduce learnable bridge tokens as the interaction medium, so that video and language tokens can only perceive information from the bridge tokens and themselves. Moreover, a memory bank is proposed to store abundant modality interaction information for adaptively generating bridge tokens according to different cases, enhancing the capacity and robustness of the inter-modality bridge. Through pre-training, MemBridge explicitly models the representations for more thorough inter-modality interaction. Comprehensive experiments show that our approach achieves performance competitive with previous methods on various downstream tasks, including video-text retrieval, video captioning, and video question answering, on multiple datasets, demonstrating the effectiveness of the proposed method. The code is available at https://github.com/jahhaoyang/MemBridge.
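For a concrete picture of the attention restriction described above, the following is a minimal sketch, not the authors' code: it builds a boolean mask for a joint [video | bridge | text] token sequence, assuming that bridge tokens attend to all tokens while video and text tokens attend only to tokens of their own modality and to the bridge tokens. All names, the token layout, and the choice that bridge tokens attend everywhere are illustrative assumptions.

import torch

def build_bridge_attention_mask(num_video: int, num_bridge: int, num_text: int) -> torch.Tensor:
    """Boolean attention mask (True = attention allowed) over a joint sequence
    laid out as [video tokens | bridge tokens | text tokens].

    Video tokens attend to themselves and the bridge tokens; text tokens attend
    to themselves and the bridge tokens; bridge tokens attend to all tokens, so
    cross-modal information has to flow through the bridge.
    """
    total = num_video + num_bridge + num_text
    mask = torch.zeros(total, total, dtype=torch.bool)

    v = slice(0, num_video)                       # video token positions
    b = slice(num_video, num_video + num_bridge)  # bridge token positions
    t = slice(num_video + num_bridge, total)      # text token positions

    mask[v, v] = True   # video  -> video
    mask[v, b] = True   # video  -> bridge
    mask[t, t] = True   # text   -> text
    mask[t, b] = True   # text   -> bridge
    mask[b, :] = True   # bridge -> everything

    return mask

# Example: 4 video tokens, 2 bridge tokens, 3 text tokens gives a 9x9 mask.
# It can be converted as required (e.g. inverted, since PyTorch's attn_mask
# marks *disallowed* positions) and passed to a standard transformer
# encoder's self-attention.
mask = build_bridge_attention_mask(4, 2, 3)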
Published in: IEEE Transactions on Image Processing ( Volume: 32)
Page(s): 4073 - 4087
Date of Publication: 12 July 2023

PubMed ID: 37436853
