
A background-aware correlation filter with adaptive saliency-aware regularization for visual tracking

  • Original Article
  • Neural Computing and Applications

Abstract

Recently, discriminative correlation filter (DCF)-based methods have achieved excellent precision and speed in object tracking. Because the search region continuously changes and expands, the problem of insufficient training samples is addressed by the periodicity assumption, which inevitably introduces boundary effects that can cause severe failures in the detection stage. In this paper, we first add a background penalty factor to the correlation filter and propose a novel spatial regularization term based on a saliency detection method. Building on these two components, we establish a background-aware correlation filter model with saliency-aware regularization (BAASR). Second, to solve the model better and faster, we introduce an energy function for computing the spatial weight, apply the alternating direction method of multipliers (ADMM), and efficiently derive the closed-form solution of each subproblem of the objective function. Third, we propose an adaptive updating mechanism based on the variation of the target appearance and the reliability of the tracking results, which updates the model online by adjusting the spatial weight distribution for precise tracking in the spatio-temporal domain. Finally, we apply two BAASR models to estimate the position and the scale of the target, respectively. One model adopts hand-crafted features at multiple scales to select the optimal scale, while the other predicts the optimal position by fusing hand-crafted features with deep features extracted from trained network models. Extensive experiments are carried out on five datasets: OTB-2013, OTB-2015, UAV123, UAV20L, and TC128. The results demonstrate that our tracker has superior robustness and performance.
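The ADMM decomposition described in the abstract can be illustrated on a much simpler problem than the paper's actual model. The sketch below uses a hypothetical single-channel, 1-D correlation filter with a spatial weight map `w`: splitting the filter into a frequency-domain variable and a spatial-domain auxiliary variable makes each subproblem elementwise and closed-form. The objective, the function name `admm_cf`, and the parameters `lam`/`mu` are illustrative assumptions, not the BAASR formulation from the paper.

```python
import numpy as np

def admm_cf(x, y, w, lam=1.0, mu=1.0, iters=50):
    """Illustrative ADMM solver for a spatially regularized
    correlation filter (single channel, 1-D, circular convolution):

        min_h ||y - x (*) h||^2 + lam * ||w . h||^2

    Splitting h (frequency domain) from an auxiliary copy g
    (spatial domain) gives two elementwise closed-form subproblems.
    """
    X, Y = np.fft.fft(x), np.fft.fft(y)
    n = x.size
    g = np.zeros(n)   # spatial-domain auxiliary variable (g = h at convergence)
    s = np.zeros(n)   # scaled Lagrange multiplier
    for _ in range(iters):
        # h-subproblem: per-frequency-bin closed form
        G, S = np.fft.fft(g), np.fft.fft(s)
        H = (np.conj(X) * Y + mu * (G - S)) / (np.abs(X) ** 2 + mu)
        h = np.real(np.fft.ifft(H))
        # g-subproblem: per-spatial-position closed form,
        # where the weight map w penalizes (e.g.) background regions
        g = mu * (h + s) / (lam * w ** 2 + mu)
        # multiplier update
        s = s + h - g
    return h
```

Each iteration costs only two FFTs plus elementwise operations, which is why ADMM solvers of this kind stay fast enough for real-time tracking even when the regularizer prevents a single global closed-form solution.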


[Figures 1–14 appear in the full article.]



Acknowledgements

This work was supported in part by the National Natural Science Foundation of China under Grant 61972056, in part by the Basic Research Fund of Zhongye Changtian International Engineering Co., Ltd. under Grant 2020JCYJ07, in part by the Research Fund of Changsha New Smart City Research Association under Grant 2020YB006, in part by the "Double First-class" International Cooperation and Development Scientific Research Project of Changsha University of Science and Technology under Grant 2019IC34, in part by the Postgraduate Training Innovation Base Construction Project of Hunan Province under Grant 2019-248-51, and in part by the Postgraduate Scientific Research Innovation Fund of Changsha University of Science and Technology under Grant CX2021SS70.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Jianming Zhang.

Ethics declarations

Conflict of interest

We declare that we have no conflicts of interest related to this work.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zhang, J., Yuan, T., He, Y. et al. A background-aware correlation filter with adaptive saliency-aware regularization for visual tracking. Neural Comput & Applic 34, 6359–6376 (2022). https://doi.org/10.1007/s00521-021-06771-4
