
Response map evaluation for RGBT tracking

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

In recent years, RGB and thermal sensors have been widely deployed, and the two modalities provide complementary information. A fundamental task in this domain is RGBT tracking, where leveraging both RGB and thermal data remains challenging. In this paper, we propose an adaptive fusion algorithm based on response map evaluation for RGBT tracking. Specifically, a hierarchical convolutional neural network is employed to extract deep features from the RGB and thermal images, respectively. The target is tracked within the correlation filter framework using each layer independently in the RGB and thermal modalities. To evaluate the response map of the tracking status under various conditions, the average sidelobe peak response (ASPR) is proposed. Gaussian process regression is then employed to produce adaptive fusion weights based on ASPR. Experimental results on two RGBT tracking datasets demonstrate the effectiveness of our method.
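The abstract does not give the ASPR formula, but the idea of scoring a correlation response map by how sharply its peak stands out over the sidelobe is well established; the classic instance is the peak-to-sidelobe ratio (PSR) from Bolme et al.'s MOSSE tracker. The sketch below, under that assumption, scores each modality's response map with a PSR-style statistic and fuses the two maps with softmax-normalized weights; the paper instead learns the weights via Gaussian process regression, so the softmax here is only a stand-in, and all function names are illustrative.

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=5):
    """PSR-style sharpness score of a correlation response map:
    (peak - sidelobe mean) / sidelobe std, where the sidelobe is the
    map excluding a small window around the peak."""
    h, w = response.shape
    py, px = np.unravel_index(np.argmax(response), response.shape)
    peak = response[py, px]
    # Mask out a (2*exclude+1)^2 window centered on the peak.
    mask = np.ones_like(response, dtype=bool)
    y0, y1 = max(0, py - exclude), min(h, py + exclude + 1)
    x0, x1 = max(0, px - exclude), min(w, px + exclude + 1)
    mask[y0:y1, x0:x1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

def fuse_responses(resp_rgb, resp_t):
    """Fuse RGB and thermal response maps, weighting each modality by
    the softmax of its PSR (a stand-in for learned fusion weights)."""
    scores = np.array([peak_to_sidelobe_ratio(resp_rgb),
                       peak_to_sidelobe_ratio(resp_t)])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights[0] * resp_rgb + weights[1] * resp_t, weights
```

A response map with a single sharp peak (reliable tracking) thus dominates the fusion, while a flat or noisy map (occlusion, thermal crossover) is down-weighted.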



Author information

Correspondence to Xian Wei or Xuan Tang.

Ethics declarations

Conflict of interest

We declare that we have no financial or personal relationships with other people or organizations that could inappropriately influence our work, and no professional or other personal interest of any nature in any product, service or company that could be construed as influencing the position presented in, or the review of, this manuscript.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work has been supported by the German Research Foundation (DFG) under Grant No. KL 2189/9-1 and by the National Science Fund for Young Scholars under Grant No. 61806186.


About this article


Cite this article

Wang, Y., Wei, X., Tang, X. et al. Response map evaluation for RGBT tracking. Neural Comput & Applic 34, 5757–5769 (2022). https://doi.org/10.1007/s00521-021-06704-1

