Online single target tracking in WAMI: benchmark and evaluation

Published in: Multimedia Tools and Applications

Abstract

Online single target tracking (OSTT) is a prominent topic in conventional surveillance environments for security and transportation applications. However, comparative analysis of OSTT remains seriously under-investigated in the context of wide area motion imagery (WAMI), even though its importance continues to rise with the growing popularity of unmanned aerial vehicles. In this work, we make several contributions toward WAMI tracking analysis. First, we propose a new WAMI OSTT benchmark dataset, named WAMI-226, which consists of 100 image frames and 226 targets. This benchmark brings together research challenges including low frame rate, low resolution, and low contrast. Second, we evaluate 20 existing online trackers in WAMI tracking scenarios. Third, by combining a basic appearance model, background subtraction, and a high-order motion (HoM) affinity, we develop a novel normalized cross correlation HoM (NCC-HoM) tracking algorithm for WAMI OSTT. Experimental results show that the proposed NCC-HoM method achieves significant improvements in both target initialization and online tracking, and thus serves as a new baseline algorithm for the WAMI-226 benchmark.
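Since the abstract only names the components of NCC-HoM, the snippet below is a minimal illustrative sketch rather than the authors' implementation: it shows how a normalized cross correlation (NCC) appearance score, the appearance model referenced above, can be computed over candidate patches. The function names, the grayscale-patch assumption, and the candidate-selection helper are hypothetical, and the background subtraction and high-order motion affinity stages are omitted.

```python
# Illustrative sketch only (not the authors' NCC-HoM code): normalized cross
# correlation between a target template and candidate patches in a frame.
import numpy as np

def ncc_score(template: np.ndarray, patch: np.ndarray) -> float:
    """Normalized cross correlation between two equally sized grayscale patches.
    Returns a value in [-1, 1]; higher means a better appearance match."""
    t = template.astype(np.float64).ravel()
    p = patch.astype(np.float64).ravel()
    t -= t.mean()          # remove mean so the score is invariant to brightness offsets
    p -= p.mean()
    denom = np.linalg.norm(t) * np.linalg.norm(p)
    if denom == 0.0:       # flat patches carry no appearance information
        return 0.0
    return float(np.dot(t, p) / denom)

def best_candidate(template: np.ndarray, frame: np.ndarray, candidates):
    """Score candidate top-left corners (row, col) in `frame` and return the best one."""
    h, w = template.shape
    scores = [ncc_score(template, frame[r:r + h, c:c + w]) for r, c in candidates]
    best = int(np.argmax(scores))
    return candidates[best], scores[best]
```

In a full tracker along the lines described in the abstract, such an appearance score would be fused with motion-consistency (HoM) and background-subtraction cues before selecting the target location in each frame.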

Acknowledgements

This work was supported in part by US NSF Grants 1618398, IIS-1449860, and IIS-1350521.

Author information

Corresponding author: Dong Wang.

Cite this article

Wang, D., Yi, M., Yang, F. et al. Online single target tracking in WAMI: benchmark and evaluation. Multimed Tools Appl 77, 10939–10960 (2018). https://doi.org/10.1007/s11042-018-5666-5
