
Visual object tracking via collaborative correlation filters

  • Original Paper
  • Signal, Image and Video Processing

Abstract

Correlation filter (CF) theory has attracted sustained attention in the tracking field by virtue of its efficiency in both the training and detection stages. The seminal idea of applying cyclic shift operations to an image patch allows CF-based trackers to realize a dense sampling scheme in an efficient manner. However, because all shifted samples provide only the single appearance of the target in the previous frame, the filters cannot acquire sufficient information about the target appearance, which can lead to tracking failure. To tackle this problem, we design a novel tracker using collaborative correlation filters, which collaboratively exploit the historical appearances of the target and the surrounding context of the previous frame for filter learning. The collaborative correlation filter therefore captures variations in the target appearance and improves discriminability when the target suffers from occlusion or motion blur. In addition, a unified optimization procedure is proposed so that this abundant information can be embedded in filter learning, and it yields a closed-form solution. Since the historical appearances of the target are collected from previous tracking results, and the target appearance may change drastically or suffer from occlusion, using all of these templates for training is inappropriate. To eliminate unreliable samples from the template set, we provide a template updating strategy based on collaborative representation. Experimental results on the standard OTB100 benchmark demonstrate the robustness of the proposed tracker.
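To make the closed-form filter learning mentioned in the abstract concrete, the sketch below trains a standard single-channel correlation filter by ridge regression in the Fourier domain, in the spirit of MOSSE/KCF. It is only an illustrative baseline under simplifying assumptions (one grayscale template, no cosine window, no feature channels) and does not reproduce the paper's collaborative formulation with historical templates and surrounding context; the function names train_cf and detect and the parameters sigma and lam are illustrative choices, not from the paper.

```python
import numpy as np


def train_cf(patch, sigma=2.0, lam=1e-2):
    """Closed-form ridge-regression CF for one grayscale patch (illustrative only)."""
    h, w = patch.shape
    # Desired response: a Gaussian centred on the target, rolled so its peak
    # sits at (0, 0) to match the cyclic-shift convention.
    ys, xs = np.mgrid[0:h, 0:w]
    g = np.exp(-((ys - h // 2) ** 2 + (xs - w // 2) ** 2) / (2 * sigma ** 2))
    g = np.roll(g, shift=(-(h // 2), -(w // 2)), axis=(0, 1))

    X = np.fft.fft2(patch)
    G = np.fft.fft2(g)
    # Because cyclic shifts diagonalize in the Fourier domain, ridge regression
    # over all shifted samples reduces to an element-wise division.
    return np.conj(X) * G / (np.conj(X) * X + lam)


def detect(filt, patch):
    """Correlate the learned filter with a new patch; the response peak gives the shift."""
    response = np.real(np.fft.ifft2(filt * np.fft.fft2(patch)))
    dy, dx = np.unravel_index(np.argmax(response), response.shape)
    return dy, dx, response
```

In a tracker, train_cf would be applied to the patch extracted around the estimated target position in frame t, and detect to the patch at the same location in frame t+1, with the peak offset giving the new position. The collaborative filters described above extend this single-template objective with historical target templates and context patches while still retaining a closed-form solution.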



Author information


Corresponding author

Correspondence to Zhenyu He.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Lu, X., Li, J., He, Z. et al. Visual object tracking via collaborative correlation filters. SIViP 14, 177–185 (2020). https://doi.org/10.1007/s11760-019-01540-2

