
Convolution operators for visual tracking based on spatial–temporal regularization

  • S.I.: ATCI 2019
  • Published in Neural Computing and Applications

Abstract

In recent years, methods based on discriminative correlation filters have shown excellent performance in short-term visual tracking. However, such methods suffer heavily from multiple peaks and model drift in the response maps caused by occlusion and rotation. To address this problem, we propose convolution operators for visual tracking based on spatial–temporal regularization. First, we add a spatial–temporal regularization term to the loss function, which guarantees temporal continuity of the model, and we use the preconditioned conjugate gradient algorithm to obtain the filter coefficients. Second, we propose a channel reliability measure to estimate the quality of the learned filter and fuse the different reliability coefficients to weight the response map during localization. We also set a threshold to reduce the number of iterations in localization and accelerate the algorithm. Finally, we use two separate correlation filters to estimate the location and scale of the target, respectively. Extensive experiments on five video sequences show that our tracker achieves significantly improved performance in cases of occlusion and rotation. The AUC of the success plot improves by 33.2% over ECO-HC and by 41.5% over STRCF.
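The abstract mentions two concrete ingredients: a preconditioned conjugate gradient (PCG) solver for the spatial–temporal regularized filter, and a reliability-weighted fusion of per-channel response maps. The paper's own formulation is not reproduced on this page, so the Python sketch below is only a minimal illustration of those two ideas on a toy ridge-style problem; the matrix A, the weights lambda_s and lambda_t, and the fuse_responses helper are hypothetical stand-ins, not the authors' implementation.

import numpy as np


def pcg(A, b, M_inv, x0=None, tol=1e-6, max_iter=100):
    """Preconditioned conjugate gradient for A x = b, with A symmetric
    positive definite and M_inv(r) applying the preconditioner inverse."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x


def fuse_responses(responses, reliabilities):
    """Weight per-channel response maps by reliability and sum them."""
    w = np.asarray(reliabilities, dtype=float)
    w = w / w.sum()
    # responses: (num_channels, H, W) -> fused map of shape (H, W)
    return np.tensordot(w, np.asarray(responses), axes=1)


# Toy example: ridge-style normal equations with a temporal term,
#   (X^T X + (lambda_s + lambda_t) I) f = X^T y + lambda_t f_prev,
# loosely mimicking a least-squares filter with spatial and temporal penalties.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))        # hypothetical training features
y = rng.standard_normal(200)              # desired correlation output
f_prev = np.zeros(50)                     # filter from the previous frame
lambda_s, lambda_t = 0.1, 1.0             # assumed regularization weights
A = X.T @ X + (lambda_s + lambda_t) * np.eye(50)
b = X.T @ y + lambda_t * f_prev
f = pcg(A, b, M_inv=lambda r: r / np.diag(A))   # Jacobi preconditioner

responses = rng.random((3, 40, 40))       # fake per-channel response maps
fused = fuse_responses(responses, reliabilities=[0.5, 0.3, 0.2])
print(f.shape, fused.shape)               # (50,) (40, 40)

The lambda_t * f_prev term pulls the new filter toward the previous frame's filter, which is the intuition behind the temporal continuity described in the abstract; the Jacobi (diagonal) preconditioner is used here only because it is the simplest choice.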


References

  1. Yin G, Liu B, Zhu H et al (2019) A large scale urban surveillance video dataset for multiple-object tracking and behavior analysis. CoRR. arXiv:1904.11784

  2. Peng C, Cao D, Wu Y et al (2019) Robot visual guide with Fourier-Mellin based visual tracking. Front Optoelectron 12(4):413–421

  3. Huang R, Liang H, Chen J et al (2016) Lidar based dynamic obstacle detection, tracking and recognition method for driverless cars. Robot 38:437–443

  4. Cooley JW, Lewis PAW, Welch PD (1969) The fast Fourier transform and its applications. IEEE Trans Educ 12(1):27–34

  5. Bolme DS, Beveridge JR, Draper BA, Lui YM (2010) Visual object tracking using adaptive correlation filters. In: CVPR

  6. Henriques J, Caseiro R, Martins P, Batista J (2015) High-speed tracking with kernelized correlation filters. IEEE Trans Pattern Anal Mach Intell 37(3):583–596

  7. Danelljan M, Hager G, Shahbaz Khan F, Felsberg M (2015) Learning spatially regularized correlation filters for visual tracking. In: ICCV, pp 4310–4318

  8. Galoogahi HK, Fagg A, Lucey S (2017) Learning background-aware correlation filters for visual tracking. In: ICCV

  9. Boyd S, Parikh N, Chu E et al (2010) Distributed optimization and statistical learning via the alternating direction method of multipliers. Found Trends Mach Learn 3(1):1–122

  10. Lukezic A, Vojir T, Cehovin Zajc L, Matas J, Kristan M (2017) Discriminative correlation filter with channel and spatial reliability. In: CVPR, pp 4847–4856

  11. Galoogahi HK, Sim T, Lucey S (2015) Correlation filters with limited boundaries. In: CVPR, pp 4630–4638

  12. Li Y, Zhu J (2014) A scale adaptive kernel correlation filter tracker with feature integration. In: ECCV workshops

  13. Danelljan M, Hager G, Khan F, Felsberg M (2017) Discriminative scale space tracking. IEEE Trans Pattern Anal Mach Intell 39(8):1561–1575

  14. Wang N, Zhou W, Tian Q et al (2018) Multi-cue correlation filters for robust visual tracking. In: CVPR

  15. Danelljan M, Hager G, Khan FS et al (2015) Convolutional features for correlation filter based visual tracking. In: ICCV workshops

  16. Danelljan M, Robinson A, Shahbaz Khan F, Felsberg M (2016) Beyond correlation filters: learning continuous convolution operators for visual tracking. In: ECCV

  17. Danelljan M, Bhat G, Shahbaz Khan F, Felsberg M (2017) ECO: efficient convolution operators for tracking. In: CVPR, pp 6931–6939

  18. Wang J, Zhang F, Huang J, Wang W, Yuan C (2019) A nonconvex penalty function with integral convolution approximation for compressed sensing. Sig Process 158:116–128

  19. Xu H, Caramanis C, Sanghavi S (2012) Robust PCA via outlier pursuit. IEEE Trans Inf Theory 58(5):3047–3064

  20. Li F, Cheng T, Zuo W et al (2018) Learning spatial-temporal regularized correlation filters for visual tracking. In: CVPR, pp 4904–4913

  21. Xu T, Feng Z, Wu X, Kittler J (2019) Learning adaptive discriminative correlation filters via temporal consistency preserving spatial feature selection for robust visual tracking. IEEE Trans Image Process 28(11):5596–5609

  22. Sun C, Wang D, Lu H, Yang M (2018) Correlation tracking via joint discrimination and reliability learning. In: ECCV

  23. Johnander J, Danelljan M, Khan FS et al (2017) DCCO: towards deformable continuous convolution operators

  24. Gladh S, Danelljan M, Khan FS et al (2016) Deep motion features for visual tracking. In: ICPR, pp 1243–1248

  25. Crammer K, Dekel O, Keshet J, Shalev-Shwartz S, Singer Y (2006) Online passive-aggressive algorithms. JMLR 7(3):551–585

  26. Wu Y, Lim J, Yang MH (2015) Object tracking benchmark. IEEE Trans Pattern Anal Mach Intell 37(9):1834–1848

Acknowledgements

This work was supported by the Scientific Research Program funded by the Shaanxi Science and Technology Department (2019GY-022, 2019GY-066), the National Natural Science Foundation of China (61671362), and the Science and Technology Program of the Weiyang District Science and Technology Department (201923).

Author information

Corresponding author

Correspondence to Peng Wang.

Ethics declarations

Conflict of interest

The authors declare that there are no conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Wang, P., Sun, M., Wang, H. et al. Convolution operators for visual tracking based on spatial–temporal regularization. Neural Comput & Applic 32, 5339–5351 (2020). https://doi.org/10.1007/s00521-020-04704-1
