
Bi-Directional Pseudo-Three-Dimensional Network for Video Frame Interpolation


Abstract:

Recent video frame interpolation methods have employed the curvilinear motion model to accommodate nonlinear motion among frames. The effectiveness of such a model often hinges on motion estimation and occlusion detection, and it is therefore greatly challenged when these methods handle dynamic scenes containing complex motions and occlusions. We address these challenges by proposing a bi-directional pseudo-three-dimensional network that exploits the correlation between motion estimation and depth-related occlusion estimation, taking the third dimension, depth, into account. Specifically, the network exploits this correlation by learning shared multi-scale spatiotemporal representations and by coupling the two estimations, in both the past and future directions, to synthesize intermediate frames through a bi-directional pseudo-three-dimensional warping layer. In this layer, adaptive convolution kernels are estimated progressively from the coalescence of motion and depth-related occlusion estimations across multiple scales to acquire nonlocal and adaptive neighborhoods. The proposed network employs a novel multi-task collaborative learning strategy, which augments the supervised learning of video frame interpolation with complementary self-supervisory signals from the motion and depth-related occlusion estimations. Across various benchmark datasets, the proposed method outperforms state-of-the-art methods in terms of accuracy, model size, and runtime.
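To make the warping-layer idea concrete, the following is a minimal NumPy sketch of per-pixel adaptive-kernel warping and occlusion-weighted bi-directional blending. All function and parameter names here are hypothetical illustrations, not the paper's implementation; the actual network predicts the kernels and occlusion maps progressively across multiple scales.

```python
import numpy as np

def adaptive_kernel_warp(frame, kernels, k=3):
    """Synthesize an output frame by applying a learned k x k kernel
    at each pixel to the corresponding neighborhood of the input frame.
    frame:   (H, W) grayscale image.
    kernels: (H, W, k*k) per-pixel kernels, assumed normalized to sum to 1.
    """
    H, W = frame.shape
    pad = k // 2
    padded = np.pad(frame, pad, mode="edge")  # replicate borders
    out = np.zeros((H, W), dtype=np.float64)
    for y in range(H):
        for x in range(W):
            patch = padded[y:y + k, x:x + k].ravel()
            out[y, x] = patch @ kernels[y, x]  # weighted neighborhood sum
    return out

def bidirectional_blend(warp_from_past, warp_from_future, occlusion):
    """Blend the two warped frames with a per-pixel occlusion map:
    occlusion -> 1 trusts the past frame, occlusion -> 0 trusts the future one."""
    return occlusion * warp_from_past + (1.0 - occlusion) * warp_from_future
```

With uniform kernels this reduces to box filtering, so a constant image is left unchanged; the network instead predicts spatially varying kernels that jointly encode motion and resampling.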
Published in: IEEE Transactions on Image Processing ( Volume: 31)
Page(s): 6773 - 6788
Date of Publication: 25 October 2022

PubMed ID: 36282822

