
Robust Object Tracking Based on Deep Feature and Dual Classifier Trained with Hard Samples

  • Conference paper
  • In: Advances in Neural Networks – ISNN 2019 (ISNN 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11555)

Abstract

Visual tracking has attracted increasing attention in recent years. In this paper, we propose a novel tracker composed of a feature network, a dual classifier, a target location module, and a sample collecting and pooling module. The dual classifier contains two classifiers, a long-term classifier and a short-term classifier: the long-term classifier maintains the long-term appearance of the target, while the short-term classifier responds promptly to sudden changes of the target. The training samples are divided into positive samples, negative samples, hard positive samples, and hard negative samples, which are used to train the two classifiers, respectively. Furthermore, to overcome the unreliability of locating the target by the highest classification score alone, a density clustering method is introduced into the target locating process. Experimental results on two benchmark datasets demonstrate the effectiveness of the proposed tracking method.
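The abstract's localization idea, picking the target from a dense cluster of high-scoring candidates rather than from the single highest-scoring one, can be illustrated with a small sketch. This is not the authors' code: all names and thresholds below are hypothetical, and a simple greedy center-distance grouping stands in for whatever density clustering method the paper actually employs.

```python
# Illustrative sketch: locate the target from scored candidate boxes by
# clustering the top-scoring candidates, instead of trusting the single
# highest score, which may be an outlier response.

def cluster_candidates(boxes, scores, top_k=5, eps=20.0):
    """Greedily group the top-k candidates by box-center proximity.

    boxes  : list of (x, y, w, h) candidate boxes
    scores : classifier score for each box
    top_k  : number of top-scoring candidates considered (hypothetical)
    eps    : max per-axis center distance, in pixels, within one cluster
    """
    ranked = sorted(zip(boxes, scores), key=lambda p: p[1], reverse=True)[:top_k]
    clusters = []  # each cluster is a list of (box, score) pairs
    for box, score in ranked:
        cx, cy = box[0] + box[2] / 2, box[1] + box[3] / 2
        placed = False
        for cluster in clusters:
            bx, by, bw, bh = cluster[0][0]  # compare against cluster seed
            if abs(cx - (bx + bw / 2)) <= eps and abs(cy - (by + bh / 2)) <= eps:
                cluster.append((box, score))
                placed = True
                break
        if not placed:
            clusters.append([(box, score)])
    return clusters

def locate_target(boxes, scores):
    """Average the boxes of the cluster with the largest total score."""
    clusters = cluster_candidates(boxes, scores)
    best = max(clusters, key=lambda c: sum(s for _, s in c))
    n = len(best)
    return tuple(sum(b[i] for b, _ in best) / n for i in range(4))
```

With this scheme, a lone high-scoring outlier loses to a dense group of mutually consistent candidates, which is the failure mode the density clustering step is meant to address.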



Author information

Correspondence to Chengan Guo.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Xu, B., Guo, C. (2019). Robust Object Tracking Based on Deep Feature and Dual Classifier Trained with Hard Samples. In: Lu, H., Tang, H., Wang, Z. (eds.) Advances in Neural Networks – ISNN 2019. Lecture Notes in Computer Science, vol. 11555. Springer, Cham. https://doi.org/10.1007/978-3-030-22808-8_47


  • DOI: https://doi.org/10.1007/978-3-030-22808-8_47

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-22807-1

  • Online ISBN: 978-3-030-22808-8

  • eBook Packages: Computer Science (R0)
