
RGBT tracking via reliable feature configuration

  • Research Paper
  • Published in Science China Information Sciences

Abstract

RGBT tracking has attracted increasing interest recently because of the complementary benefits of RGB and thermal infrared data. However, the reliability of each modality may change over time, and a modality with low reliability can degrade tracking performance. We therefore propose a novel reliability-based feature configuration approach within the correlation filter framework for robust RGBT tracking. Specifically, we configure a feature set based on RGB, thermal, and RGBT data. To measure the reliability of each feature configuration, we equip it with a tracker and design a guideline for judging whether that tracker is reliable; the most reliable tracker is then used for tracking. Experimental results show that the proposed tracker achieves promising performance against other RGBT tracking methods.
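The abstract describes equipping each feature configuration (RGB, thermal, and RGBT) with its own correlation-filter tracker and switching to the most reliable one per frame. The paper's exact reliability guideline is not given here, so the sketch below is only a hypothetical illustration: it scores each tracker by the peak-to-sidelobe ratio (PSR) of its response map, a common correlation-filter confidence measure. The `ConfigTracker` class and the threshold value are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass


@dataclass
class ConfigTracker:
    """A correlation-filter tracker bound to one feature configuration."""
    name: str              # "rgb", "thermal", or "rgbt"
    response_peak: float   # peak value of the correlation response map
    sidelobe_mean: float   # mean of the response outside the peak region
    sidelobe_std: float    # std of the response outside the peak region

    def reliability(self) -> float:
        """Peak-to-sidelobe ratio (PSR): a sharper, more trustworthy
        response peak yields a higher score."""
        return (self.response_peak - self.sidelobe_mean) / max(self.sidelobe_std, 1e-8)


def select_reliable_tracker(trackers, psr_threshold=5.0):
    """Return the most reliable tracker and whether it passes the threshold."""
    best = max(trackers, key=lambda t: t.reliability())
    return best, best.reliability() >= psr_threshold


# Example frame: the thermal response collapses (e.g. a hot background),
# so the fused RGBT configuration is selected.
trackers = [
    ConfigTracker("rgb", 0.90, 0.20, 0.10),      # PSR = 7.0
    ConfigTracker("thermal", 0.50, 0.30, 0.20),  # PSR = 1.0
    ConfigTracker("rgbt", 0.95, 0.15, 0.10),     # PSR = 8.0
]
best, is_reliable = select_reliable_tracker(trackers)
print(best.name, is_reliable)  # rgbt True
```

Running this selection per frame lets the system fall back to a single-modality tracker when fusion is corrupted by an unreliable modality, which is the intuition the abstract conveys.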



Acknowledgements

The work was supported by Natural Science Foundation of Anhui Higher Education Institutions of China (Grant Nos. KJ2020A0033, KJ2019A0005, KJ2019A0026), Major Project for New Generation of AI (Grant No. 2018AAA0100400), National Natural Science Foundation of China (Grant No. 61976003), and NSFC Key Projects in International (Regional) Cooperation and Exchanges (Grant No. 61860206004).

Author information


Corresponding author

Correspondence to Chenglong Li.

About this article

Cite this article

Tu, Z., Pan, W., Duan, Y. et al. RGBT tracking via reliable feature configuration. Sci. China Inf. Sci. 65, 142101 (2022). https://doi.org/10.1007/s11432-020-3160-5

