Context and saliency aware correlation filter for visual tracking

Published in: Multimedia Tools and Applications

Abstract

Visual tracking in complex scenarios remains a significant challenge for the computer vision community. Because correlation filters (CFs) have recently achieved excellent accuracy and robustness in visual tracking, many researchers have focused on incorporating different features to better represent the tracked target. However, CF-based trackers cope poorly with complex scenes that involve challenges such as deformation, motion blur, and background clutter. To overcome these defects, we propose a context and saliency aware CF for visual tracking (CSCF). Context information around the target of interest is introduced into the correlation filter to strengthen its discriminative ability, which reduces the boundary effect and the influence of the background. The saliency feature map of the target is then combined with the CF to strengthen its ability to extract the target of interest from a complex background. Experimental results show that the proposed method achieves competitive performance on the OTB and UAV datasets compared with several other CF trackers.
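The full CSCF formulation is not included in this preview. As a rough illustration of the two ingredients named above, the sketch below trains a single-channel correlation filter in the Fourier domain, regresses sampled context patches to a zero response (a context-aware regularization term), and re-weights the target appearance with a saliency map before training (one simple way a saliency cue could be injected). The function names, the regularization weights lam1/lam2, and the saliency weighting scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_label(shape, sigma=2.0):
    """Desired Gaussian-shaped correlation response, peaked at the patch origin."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist2 = (ys - h // 2) ** 2 + (xs - w // 2) ** 2
    y = np.exp(-dist2 / (2.0 * sigma ** 2))
    # Shift the peak from the patch centre to (0, 0) to match circular correlation.
    return np.roll(y, (-(h // 2), -(w // 2)), axis=(0, 1))

def train_context_aware_cf(target_patch, context_patches, saliency_map,
                           lam1=1e-2, lam2=25.0):
    """Closed-form, single-channel context-aware CF learned in the Fourier domain.

    target_patch    : HxW grayscale patch centred on the target
    context_patches : list of HxW patches sampled around the target (context)
    saliency_map    : HxW map in [0, 1]; here it simply re-weights the target
                      appearance so salient pixels dominate the filter (assumption)
    """
    x = target_patch * (0.5 + 0.5 * saliency_map)   # saliency-weighted appearance
    xf = np.fft.fft2(x)
    yf = np.fft.fft2(gaussian_label(x.shape))

    # Context patches act as hard negatives whose response is regressed to zero;
    # their spectral energy enters the denominator as an extra regularizer.
    context_energy = sum(np.abs(np.fft.fft2(c)) ** 2 for c in context_patches)

    wf = (np.conj(xf) * yf) / (np.abs(xf) ** 2 + lam1 + lam2 * context_energy)
    return wf

def detect(wf, search_patch):
    """Correlate the learned filter with a search patch; the argmax gives the shift."""
    zf = np.fft.fft2(search_patch)
    response = np.real(np.fft.ifft2(wf * zf))
    return np.unravel_index(np.argmax(response), response.shape)
```

In a tracker loop one would, per frame, extract the target, context, and search patches, train (or linearly interpolate) the filter on the current appearance, and translate the peak location returned by detect into the new target position.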

Availability of data and material

All the data used to support the findings of this study are included within the article.


Funding

This work was supported by the National Natural Science Foundation of China (Grant Nos. 61972068 and 61976042), the LiaoNing Revitalization Talents Program (Grant No. XLYC2007023), the Wuhan Chegu Industrial Talents Program, and the Innovative Talents Program for Liaoning Universities (Grant No. LR2019020).

Author information

Corresponding author

Correspondence to Fuming Sun.

Ethics declarations

Ethics approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Conflict of interest

The authors declare that there is no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Wang, F., Yin, S., Mbelwa, J.T. et al. Context and saliency aware correlation filter for visual tracking. Multimed Tools Appl 81, 27879–27893 (2022). https://doi.org/10.1007/s11042-022-12760-z

  • DOI: https://doi.org/10.1007/s11042-022-12760-z
