
Anti-occlusion Video Target Tracking Based on Double Threshold Judgment

  • Conference paper

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 1074))

Abstract

In recent years, moving target tracking methods have made great progress, but problems that degrade tracking accuracy remain unsolved, in particular occlusion of the target during tracking. To address this problem, this paper proposes an anti-occlusion video target tracking method based on double threshold judgment, which combines the Discriminative Scale Space Tracking (DSST) algorithm with Kalman filtering and switches between them according to the change of the correlation coefficient \( P \) between adjacent templates. First, the target is tracked by the DSST algorithm, which yields the location and scale of the target. During tracking, an occlusion threshold \( T_{1} \) is set to determine whether occlusion has occurred. If it has not, DSST tracking continues; otherwise, the Kalman filter is invoked to track the target while it is occluded. At the same time, a second threshold \( T_{2} \) on the change of the correlation coefficient is used to judge whether the occlusion has ended. If it has, tracking returns to the DSST algorithm; otherwise, the Kalman filter continues to track the target through the occlusion until tracking ends. Simulation results show that the method effectively solves the problem of tracking moving targets when they are occluded.
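The following is a minimal sketch of the double-threshold switching scheme described in the abstract, not the authors' implementation: the `dsst_track` and `crop` callables stand in for a real DSST tracker and a fixed-size patch extractor, and the threshold values `T1` and `T2` are assumed for illustration.

```python
# Illustrative sketch (assumptions, not the paper's code): DSST supplies position and
# scale; a constant-velocity Kalman filter takes over when the correlation coefficient
# P between adjacent templates falls below T1, and control returns to DSST once P
# rises above T2.
import numpy as np

T1, T2 = 0.4, 0.6  # occlusion-onset / occlusion-end thresholds (illustrative values)

class ConstantVelocityKF:
    """4-state (x, y, vx, vy) Kalman filter for the target centre."""
    def __init__(self, x, y):
        self.s = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4) * 10.0
        self.F = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.Q = np.eye(4) * 0.01
        self.R = np.eye(2) * 1.0

    def predict(self):
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def correct(self, z):
        y = np.asarray(z, float) - self.H @ self.s
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

def correlation(a, b):
    """Pearson correlation coefficient between two equally sized patches."""
    return float(np.corrcoef(a.ravel().astype(float), b.ravel().astype(float))[0, 1])

def track(frames, init_center, init_size, dsst_track, crop):
    """dsst_track(frame, center, size) -> (center, size); crop(frame, center, size)
    returns a fixed-size patch. Both are placeholders for a real DSST implementation."""
    kf = ConstantVelocityKF(*init_center)
    center, size = init_center, init_size
    template = crop(frames[0], center, size)
    occluded, trajectory = False, []
    for frame in frames[1:]:
        if not occluded:
            center, size = dsst_track(frame, center, size)   # DSST: position + scale
            patch = crop(frame, center, size)
            p = correlation(template, patch)                  # P between adjacent templates
            if p < T1:                                        # occlusion detected
                occluded = True
                center = tuple(kf.predict())                  # hand over to the Kalman filter
            else:
                kf.predict(); kf.correct(center)              # keep the filter in step with DSST
                template = patch                              # update the template
        else:
            center = tuple(kf.predict())                      # track through the occlusion
            p = correlation(template, crop(frame, center, size))
            if p > T2:                                        # occlusion over
                occluded = False                              # return to DSST tracking
        trajectory.append((center, size))
    return trajectory
```

Using two thresholds with \( T_{2} > T_{1} \) adds hysteresis, so the tracker does not oscillate between the two modes when \( P \) hovers near a single threshold.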



Author information

Correspondence to Ying Zhang.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, Y., Du, Fc., Xu, K. (2020). Anti-occlusion Video Target Tracking Based on Double Threshold Judgment. In: Liu, Y., Wang, L., Zhao, L., Yu, Z. (eds) Advances in Natural Computation, Fuzzy Systems and Knowledge Discovery. ICNC-FSKD 2019. Advances in Intelligent Systems and Computing, vol 1074. Springer, Cham. https://doi.org/10.1007/978-3-030-32456-8_31

