Abstract
Discriminative correlation filter-based tracking algorithms cannot correctly recover the target when it is occluded or leaves the field of view and later reappears, and they cannot guarantee that the tracking model is updated correctly when the tracking result is unreliable. In this paper, a robust correlation tracking algorithm is proposed. A failure detection strategy based on the maximal confidence score and the peak-to-sidelobe ratio (PSR) is integrated into the tracker to measure the reliability of the tracking result. A redetection module based on keypoint matching with consensus voting is introduced to recover the target after a tracking failure. In addition, an adaptive high-confidence updating method, in which the learning rate is determined by the change rate of the confidence map, is proposed to prevent erroneous model information caused by occlusion, out-of-view motion, or illumination changes from being introduced into the tracker. The OTB-2015 and VOT-2016 datasets are used to evaluate the proposed algorithm. The experimental results show that it performs better than most state-of-the-art trackers and achieves higher accuracy and robustness than the DSST tracker.
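The adaptive high-confidence update described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the thresholds, and the base learning rate are assumptions chosen for the example.

```python
import numpy as np

def update_model(model, observation, score, psr_value,
                 score_thresh=0.35, psr_thresh=8.0, base_lr=0.02):
    """Interpolate the tracking model toward the new observation only
    when both confidence measures are high. All threshold values here
    are illustrative assumptions, not the paper's actual parameters."""
    if score < score_thresh or psr_value < psr_thresh:
        # Unreliable frame (e.g., occlusion or out-of-view): skip the
        # update so erroneous appearance information does not enter
        # the model.
        return model
    # Scale the learning rate with the confidence of the current frame.
    lr = base_lr * min(score / score_thresh, 2.0)
    return (1.0 - lr) * model + lr * observation
```

Skipping the update on low-confidence frames keeps a contaminated observation from being blended into the filter, which is what allows the tracker to recover once the target reappears.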
Acknowledgements
This work was partially supported by the Natural Science Foundation of China (No. 61603274), the Natural Science Foundation of Tianjin (No. 18JCYBJC87700), South African National Research Foundation Grants (Nos. 112108 and 112142), the South African National Research Foundation Incentive Grant (No. 114911), and an Eskom Tertiary Education Support Programme Grant.
Ethics declarations
Conflict of interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix
This appendix contains additional experimental results.
Analysis of the effectiveness of PSR
Here, we add experiments to analyze the effectiveness of the PSR measure introduced in Sect. 3.2. As shown in Fig. 14, both the maximal confidence score and the PSR reflect the confidence of the tracking result to some extent. Therefore, the maximal confidence score and the PSR are used as references for judging whether the tracking result is reliable.
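As a minimal sketch of how a PSR value can be computed from a correlation response map (the size of the excluded peak window and the stabilizing epsilon are assumptions for this example, not the paper's settings):

```python
import numpy as np

def psr(response, exclude=11):
    """Peak-to-sidelobe ratio of a correlation response map:
    (peak - mean(sidelobe)) / std(sidelobe), where the sidelobe is
    the map with a small window around the peak excluded. The window
    size (11x11) is an illustrative assumption."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    # Mask out a window around the peak; the rest is the sidelobe.
    mask = np.ones_like(response, dtype=bool)
    h = exclude // 2
    mask[max(py - h, 0):py + h + 1, max(px - h, 0):px + h + 1] = False
    side = response[mask]
    # Small epsilon guards against a zero-variance sidelobe.
    return (peak - side.mean()) / (side.std() + 1e-12)
```

A sharp, isolated peak yields a high PSR, while a flat or multi-modal response (typical under occlusion) yields a low one, which is why the PSR serves as a reliability cue.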
Additional results on out-of-view dataset
Here, we provide further experimental evaluation on the 14 out-of-view videos in the OTB-2015 dataset. Table 5 reports the average overlap accuracy of the proposed method on each sequence, compared with seven state-of-the-art trackers.
Additional results on OTB-2015 dataset
Here, as shown in Fig. 15, we report the average precision for each of the 11 challenging attributes.
Cite this article
Dong, E., Deng, M. & Wang, Z. A robust tracking algorithm with an online detector and high-confidence updating strategy. Vis Comput 37, 567–585 (2021). https://doi.org/10.1007/s00371-020-01824-6