
Quality improvement of motion-compensated frame interpolation by self-similarity based context feature

  • Published: Multimedia Tools and Applications

Abstract

The Block Matching Algorithm (BMA) is the core of Motion-Compensated Frame Interpolation (MCFI), and its accuracy largely determines the interpolation quality of MCFI. To improve matching accuracy, this paper proposes a self-similarity based context feature for BMA. First, we extract the patch centered at each pixel in a block and apply the self-similarity descriptor to generate its correlation surface. Second, the correlation surface is statistically measured to produce the context feature, and the context cube of a block is formed by attaching the corresponding context feature to each pixel. Finally, we fuse the context cube into the bidirectional matching criterion of BMA to obtain the motion vector field of the absent frame, and predict the absent frame by motion-compensated interpolation. Experimental results show that the proposed algorithm improves BMA accuracy with low computational complexity, and outperforms traditional MCFI algorithms in both the objective and subjective quality of the interpolated frames.
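The pipeline outlined in the abstract (patch extraction → self-similarity correlation surface → statistical context feature → fusion into the bidirectional matching cost) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the patch and region sizes, the exponential similarity mapping, the mean/variance statistics, and the fusion weight `lam` are all assumptions, since the abstract does not specify them.

```python
import numpy as np

def correlation_surface(frame, y, x, patch=5, region=15):
    """Correlate the small patch centered at (y, x) against every
    position inside a larger surrounding region, in the spirit of a
    local self-similarity descriptor (sizes are illustrative)."""
    p, r = patch // 2, region // 2
    ref = frame[y - p:y + p + 1, x - p:x + p + 1].astype(np.float64)
    surf = np.empty((region, region))
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            cand = frame[y + dy - p:y + dy + p + 1,
                         x + dx - p:x + dx + p + 1].astype(np.float64)
            ssd = np.sum((ref - cand) ** 2)
            # Map SSD to a similarity in (0, 1]; this normalization is a guess.
            surf[dy + r, dx + r] = np.exp(-ssd / (255.0 ** 2 * patch * patch))
    return surf

def context_feature(surf):
    """Summarize the correlation surface with simple statistics
    (the paper only says the surface is 'statistically measured';
    mean and variance are placeholder choices here)."""
    return np.array([surf.mean(), surf.var()])

def bidirectional_cost(prev_blk, next_blk, prev_ctx, next_ctx, lam=0.1):
    """Bidirectional matching cost: pixel SAD between the two
    motion-aligned blocks plus a weighted distance between their
    context features (lam is a hypothetical fusion weight)."""
    sad = np.abs(prev_blk.astype(np.float64) - next_blk.astype(np.float64)).sum()
    ctx = np.abs(prev_ctx - next_ctx).sum()
    return sad + lam * ctx
```

A block matcher along these lines would evaluate `bidirectional_cost` for each candidate motion vector v, comparing the block at x − v in the previous frame with the block at x + v in the next frame, and keep the minimizing v as the motion vector of the absent-frame block.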




Acknowledgements

This work was funded in part by the Project of Science and Technology Department of Henan Province in China, under Grant no. 212102210106, in part by the National Natural Science Foundation of China, under Grant nos. 61572417, 31872704, in part by Innovation Team Support Plan of University Science and Technology of Henan Province in China, under Grant no. 19IRTSTHN014, and in part by the Guangxi Key Laboratory of Wireless Wideband Communication and Signal Processing and China Ministry of Education Key Laboratory of Cognitive Radio and Information Processing.

Author information


Corresponding author

Correspondence to Ran Li.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Li, R., Hao, P., Sun, F. et al. Quality improvement of motion-compensated frame interpolation by self-similarity based context feature. Multimed Tools Appl 81, 24301–24318 (2022). https://doi.org/10.1007/s11042-022-12814-2


