
Obscuration Resistant Method For Unmanned Air Vehicle Object Tracking

Published: 25 February 2022

ABSTRACT

Target occlusion is a difficult problem in Unmanned Air Vehicle (UAV) object tracking, especially in the presence of similar interfering objects, which can easily cause model drift. To solve this problem, we propose a novel anti-occlusion strategy: the interfering targets are modeled in advance, and the target is repositioned by finding the best match once it becomes obscured. In addition, to determine accurately whether the target is occluded and to model the interfering targets at the right time, we also propose judgment conditions for target occlusion and for interference-target modeling. Experiments demonstrate that the proposed anti-occlusion algorithm achieves good robustness and accuracy on both the Object Tracking Benchmark (OTB100) and the UAV Tracking Benchmark (UAV123).
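The abstract does not give the paper's actual occlusion criterion or matching method, but the overall strategy — judge occlusion from tracker confidence, keep templates of nearby distractors, and relocate the target by best match while rejecting candidates that match a pre-modeled distractor better — can be sketched roughly as follows. All function names and thresholds here are hypothetical illustrations, and normalized cross-correlation is assumed as the matching measure; the paper's own judgment conditions may differ.

```python
import numpy as np

def _ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def peak_confidence(response):
    """Sharpness of a correlation response map (peak relative to the mean,
    in units of the standard deviation). A low value is used here as a
    stand-in for the paper's target-occlusion judgment condition."""
    return float((response.max() - response.mean()) / (response.std() + 1e-8))

def relocate(candidates, target_tmpl, distractor_tmpls):
    """Relocate a presumed-occluded target: among candidate patches, return
    the index of the one matching the target template best, skipping any
    candidate that matches a pre-modeled distractor template better."""
    best_i, best_s = -1, -np.inf
    for i, patch in enumerate(candidates):
        s_target = _ncc(target_tmpl, patch)
        s_distr = max((_ncc(d, patch) for d in distractor_tmpls),
                      default=-np.inf)
        if s_target > s_distr and s_target > best_s:
            best_i, best_s = i, s_target
    return best_i
```

In this sketch, a frame whose response map falls below a confidence threshold would trigger `relocate` over a set of candidate patches; the distractor templates would be captured earlier, when a secondary response peak indicates a similar interfering object.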


  • Published in

    ACAI '21: Proceedings of the 2021 4th International Conference on Algorithms, Computing and Artificial Intelligence
    December 2021
    699 pages
    ISBN:9781450385053
    DOI:10.1145/3508546

    Copyright © 2021 ACM


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Qualifiers

    • research-article
    • Research
    • Refereed limited

    Acceptance Rates

Overall Acceptance Rate: 173 of 395 submissions, 44%