A novel framework for robust long-term object tracking in real-time

  • Short Paper
  • Published in: Machine Vision and Applications

Abstract

In this paper, we study the problem of long-term object tracking, where the object may become fully occluded or leave and re-enter the camera view. In this setting, drift due to significant appearance changes of the object and recovery from tracking failure are the two major issues. To address them, we propose an intelligent framework that integrates a tracker and a detector, wherein the tracker module is used to validate the output of the detector with online learning. The key insight of our work is the importance of how the tracker and detector are integrated, which has received little attention in the literature. Based on the proposed framework, a correlation filter-based tracker and a cascaded detector are used to implement a robust long-term tracking algorithm. Extensive experimental results show that the proposed framework outperforms specific choices of tracker/detector modules as well as state-of-the-art tracking-and-detection methods. Additionally, we extend the proposed system with a centralized strategy to achieve cooperative tracking using multiple cameras in a laboratory setting.
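
Because the abstract's central claim concerns how the tracker and detector are integrated, the following sketch illustrates one plausible form of that control loop: the correlation filter tracker runs frame by frame, and when its confidence drops, detector proposals are accepted only if they are validated by the tracker's online-learned appearance model. This is a minimal sketch, not the authors' implementation (their code is linked in Note 2); the class names, method signatures, and thresholds are assumptions made for illustration.

```python
# Illustrative sketch only (not the authors' released code; see Note 2):
# a tracker-detector integration loop in which the tracker's online-learned
# model validates detector proposals before the tracker is re-initialized.
# All names, signatures, and thresholds below are assumptions.

from dataclasses import dataclass
from typing import Iterable, List, Optional, Protocol, Tuple

Box = Tuple[int, int, int, int]  # (x, y, width, height)


@dataclass
class TrackerOutput:
    box: Optional[Box]
    confidence: float  # e.g., peak strength of the correlation response


class Tracker(Protocol):
    """Interface assumed for a correlation filter-style tracker."""

    def track(self, frame) -> TrackerOutput: ...

    def score(self, frame, box: Box) -> float:
        """Response of the current online model at a candidate box."""
        ...

    def update(self, frame, box: Box) -> None:
        """Online appearance-model update (called only when confident)."""
        ...

    def reinitialize(self, frame, box: Box) -> None: ...


class Detector(Protocol):
    """Interface assumed for a cascaded detector scanning the full frame."""

    def detect(self, frame) -> List[Box]: ...


def long_term_track(frames: Iterable, tracker: Tracker, detector: Detector,
                    track_thresh: float = 0.25,
                    validate_thresh: float = 0.40):
    """Per-frame control loop: the tracker validates the detector's output."""
    for frame in frames:
        out = tracker.track(frame)

        if out.box is not None and out.confidence >= track_thresh:
            # Normal operation: the tracker is confident, keep learning online.
            tracker.update(frame, out.box)
            yield out.box
            continue

        # Tracking failure (e.g., full occlusion or target left the view):
        # ask the detector for candidates, then let the tracker's online
        # model pick and validate the best one before re-initializing.
        best_box, best_score = None, float("-inf")
        for candidate in detector.detect(frame):
            s = tracker.score(frame, candidate)
            if s > best_score:
                best_box, best_score = candidate, s

        if best_box is not None and best_score >= validate_thresh:
            tracker.reinitialize(frame, best_box)
            yield best_box
        else:
            yield None  # target reported as absent in this frame
```

The design point emphasized in the abstract is the validation step: re-initialization happens only when a detection also scores well under the tracker's current appearance model, which is what limits drift after occlusions and re-entries.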


Notes

  1. The authors of [10] report DP and OS values that are inconsistent between Table 1 of their paper and the corresponding descriptive text in their Results section.

  2. https://bitbucket.org/zhengxiaoxu/lrt.

References

  1. Adam, A., Rivlin, E., Shimshoni, I.: Robust fragments-based tracking using the integral histogram. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 798–805 (2006)

  2. Avidan, S.: Support vector tracking. IEEE Trans. Pattern Anal. Mach. Intell. 26(8), 1064–1072 (2004)

  3. Babenko, B., Yang, M.H., Belongie, S.: Robust object tracking with online multiple instance learning. IEEE Trans. Pattern Anal. Mach. Intell. 33(8), 1619–1632 (2011)

  4. Black, M.J., Jepson, A.D.: Eigentracking: robust matching and tracking of articulated objects using a view-based representation. Int. J. Comput. Vis. 26(1), 63–84 (1998)

  5. Bolme, D.S., Beveridge, J.R., Draper, B., Lui, Y.M.: Visual object tracking using adaptive correlation filters. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, pp. 2544–2550 (2010)

  6. Collins, R., Zhou, X., Teh, S.K.: An open source tracking testbed and evaluation web site. In: IEEE International Workshop on Performance Evaluation of Tracking and Surveillance, pp. 17–24 (2005)

  7. Collins, R.T., Liu, Y., Leordeanu, M.: Online selection of discriminative tracking features. IEEE Trans. Pattern Anal. Mach. Intell. 27(10), 1631–1643 (2005)

  8. Danelljan, M., Häger, G., Khan, F., Felsberg, M.: Accurate scale estimation for robust visual tracking. In: Proceedings of British Machine Vision Conference (2014)

  9. Danelljan, M., Häger, G., Khan, F.S., Felsberg, M.: Discriminative scale space tracking. IEEE Trans. Pattern Anal. Mach. Intell. 39(8), 1561–1575 (2017)

  10. Dong, X., Shen, J., Yu, D., Wang, W., Liu, J., Huang, H.: Occlusion-aware real-time object tracking. IEEE Trans. Multimed. 19(4), 763–771 (2017)

  11. Grabner, H., Grabner, M., Bischof, H.: Real-time tracking via on-line boosting. In: Proceedings of British Machine Vision Conference, vol. 1, pp. 1–6 (2006)

  12. Hare, S., Saffari, A., Torr, P.H.: Struck: structured output tracking with kernels. In: Proceedings of IEEE International Conference on Computer Vision, pp. 263–270 (2011)

  13. Henriques, J.F., Caseiro, R., Martins, P., Batista, J.: Exploiting the circulant structure of tracking-by-detection with kernels. In: Proceedings of European Conference on Computer Vision, pp. 702–715. Springer (2012)

  14. Henriques, J.F., Caseiro, R., Martins, P., Batista, J.: High-speed tracking with kernelized correlation filters. IEEE Trans. Pattern Anal. Mach. Intell. 37(3), 583–596 (2015)

  15. Isard, M., Blake, A.: Condensation: conditional density propagation for visual tracking. Int. J. Comput. Vis. 29(1), 5–28 (1998)

  16. Jepson, A.D., Fleet, D.J., El-Maraghi, T.F.: Robust online appearance models for visual tracking. IEEE Trans. Pattern Anal. Mach. Intell. 25(10), 1296–1311 (2003)

  17. Jia, X., Lu, H., Yang, M.H.: Visual tracking via adaptive structural local sparse appearance model. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, pp. 1822–1829 (2012)

  18. Kalal, Z., Mikolajczyk, K., Matas, J.: Tracking–learning–detection. IEEE Trans. Pattern Anal. Mach. Intell. 34(7), 1409–1422 (2012)

  19. Lepetit, V., Fua, P.: Keypoint recognition using randomized trees. IEEE Trans. Pattern Anal. Mach. Intell. 28(9), 1465–1479 (2006)

  20. Lowe, D.G.: Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 60(2), 91–110 (2004)

  21. Ma, C., Huang, J.B., Yang, X., Yang, M.H.: Hierarchical convolutional features for visual tracking. In: IEEE International Conference on Computer Vision (ICCV), pp. 3074–3082. IEEE (2015)

  22. Mei, X., Ling, H.: Robust visual tracking using ℓ1 minimization. In: Proceedings of IEEE International Conference on Computer Vision, pp. 1436–1443 (2009)

  23. Ozuysal, M., Fua, P., Lepetit, V.: Fast keypoint recognition in ten lines of code. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–8 (2007)

  24. Ren, S., He, K., Girshick, R., Sun, J.: Faster R-CNN: towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 39(6), 1137–1149 (2017)

  25. Ross, D.A., Lim, J., Lin, R.S., Yang, M.H.: Incremental learning for robust visual tracking. Int. J. Comput. Vis. 77(1–3), 125–141 (2008)

  26. Viola, P., Jones, M.: Rapid object detection using a boosted cascade of simple features. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, vol. 1, pp. I-511–I-518 (2001)

  27. Williams, O., Blake, A., Cipolla, R.: Sparse Bayesian learning for efficient visual tracking. IEEE Trans. Pattern Anal. Mach. Intell. 27(8), 1292–1304 (2005)

  28. Wu, Y., Lim, J., Yang, M.H.: Online object tracking: a benchmark. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, pp. 2411–2418 (2013)

  29. Wu, Y., Lim, J., Yang, M.H.: Object tracking benchmark. IEEE Trans. Pattern Anal. Mach. Intell. 37(9), 1834–1848 (2015)

  30. Yang, Y., Zhai, S., Ramesh, B., Zheng, X., Xiang, C., Chen, B.M., Lee, T.H.: Long-term cooperative tracking using multiple unmanned aerial vehicles. In: IEEE International Conference on Control and Automation (ICCA), pp. 56–61. IEEE (2016)

  31. Zhang, J., Ma, S., Sclaroff, S.: MEEM: robust tracking via multiple experts using entropy minimization. In: European Conference on Computer Vision, pp. 188–203. Springer (2014)

  32. Zhang, K., Zhang, L., Yang, M.H.: Real-time compressive tracking. In: Proceedings of European Conference on Computer Vision, pp. 864–877. Springer (2012)

  33. Zhang, K., Zhang, L., Liu, Q., Zhang, D., Yang, M.H.: Fast visual tracking via dense spatio-temporal context learning. In: Proceedings of European Conference on Computer Vision, pp. 127–141. Springer (2014)

Acknowledgements

This study was funded by the Ministry of Defence (Grant No. TDSI/13-004/1A).

Author information

Corresponding author

Correspondence to Bharath Ramesh.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zheng, X., Ramesh, B., Gao, Z. et al. A novel framework for robust long-term object tracking in real-time. Machine Vision and Applications 30, 529–539 (2019). https://doi.org/10.1007/s00138-018-0992-1
