
Robust visual tracker combining temporal consistent constraint and adaptive spatial regularization

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Existing discriminative correlation filters suffer from two defects: susceptibility to spatial distractors and degradation of the appearance model caused by hard temporal correlation. To address these issues, this paper proposes a robust tracker that combines adaptive spatial regularization with a temporally consistent constraint. First, we take the saliency map extracted from the background as a reference weight to construct the spatial regularization term, which enhances the filter's ability to perceive and suppress distractors by learning a spatially sparse constraint adaptively. Second, we impose a temporally consistent regularization that captures dynamic appearance information from multiple historical frames through a high-confidence strategy, thereby mitigating model degradation. Third, we employ the alternating direction method of multipliers (ADMM) to solve the constrained optimization problem efficiently, which reduces the computational complexity. Experimental results on the OTB-2013, OTB-2015, Temple-Color-128 and VOT2016 benchmarks demonstrate that our tracker outperforms several state-of-the-art algorithms.
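To make the formulation concrete, a minimal sketch of the kind of objective described above, written in the spirit of spatially and temporally regularized correlation filters, is

$$
E(\mathbf{f}) = \frac{1}{2}\Big\lVert \mathbf{y} - \sum_{d=1}^{D} \mathbf{x}_d \ast \mathbf{f}_d \Big\rVert_2^2 + \frac{\lambda}{2}\sum_{d=1}^{D}\big\lVert \mathbf{w} \odot \mathbf{f}_d \big\rVert_2^2 + \frac{\mu}{2}\sum_{d=1}^{D}\big\lVert \mathbf{f}_d - \bar{\mathbf{f}}_d \big\rVert_2^2,
$$

where $\mathbf{x}_d$ and $\mathbf{f}_d$ denote the d-th feature channel and its filter, $\mathbf{y}$ is the desired response, $\mathbf{w}$ is a spatial weight (here assumed to be derived from the background saliency map), $\bar{\mathbf{f}}$ is a reference filter aggregated from high-confidence historical frames, and $\lambda$, $\mu$ are trade-off parameters. This notation is illustrative and not the authors' exact formulation. Introducing an auxiliary variable $\mathbf{g} = \mathbf{f}$ with an equality constraint lets ADMM alternate between a closed-form frequency-domain update for $\mathbf{f}$ and an element-wise spatial-domain update for $\mathbf{g}$, which is what keeps the per-frame computational cost low.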



Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (Grant No. 61972307), the Foundation of Preliminary Research Field of China (Grant No. 61405170206), and the 13th Five-Year Equipment Development Project of China (Grant No. 41412010202).

Author information


Corresponding author

Correspondence to Guixi Liu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Zhang, Y., Liu, G., Zhang, H. et al. Robust visual tracker combining temporal consistent constraint and adaptive spatial regularization. Neural Comput & Applic 33, 8355–8374 (2021). https://doi.org/10.1007/s00521-020-05589-w
