
A fast and accurate moving object tracker in active camera model

Published in: Multimedia Tools and Applications

Abstract

Detecting and tracking moving objects within a scene is an essential step for high-level machine vision applications such as video content analysis. In this paper, we propose a fast and accurate method for tracking an object of interest in a dynamic environment (active camera model). First, we manually select the region of the object of interest and extract three statistical features, namely the mean, the variance, and the range of the intensity values of the feature points lying inside the selected region. Then, using the motion information of the background’s feature points and the k-means clustering algorithm, we calculate the camera motion transformation matrix. Based on this matrix, the previous frame is transformed into the current frame’s coordinate system to compensate for the impact of camera motion. Afterwards, we detect the regions of moving objects within the scene using our frame difference algorithm. Subsequently, utilizing the DBSCAN clustering algorithm, we cluster the feature points of the extracted regions in order to find the distinct moving objects. Finally, we use the same statistical features (the mean, the variance, and the range of intensity values) as a template to identify and track the moving object of interest among the detected moving objects. Our approach is simple and straightforward, yet robust, accurate, and time-efficient. Experimental results on various videos show that our tracker achieves acceptable performance compared to more complex competitors.
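
To make the pipeline concrete, below is a minimal sketch of the steps described above, written in Python with OpenCV, NumPy, and scikit-learn. It is not the authors' implementation: the function names (region_template, compensate_camera_motion, detect_and_track), the use of a partial affine transform for the camera motion matrix, and all parameter values (corner counts, the difference threshold, the DBSCAN eps and min_samples) are illustrative assumptions; the paper's own frame-difference algorithm and transformation model may differ in detail.

```python
# Hedged sketch of the tracking pipeline from the abstract. All names and
# parameter values are illustrative assumptions, not the authors' code.
import cv2
import numpy as np
from sklearn.cluster import DBSCAN, KMeans


def region_template(gray, points):
    """Statistical template of a region: mean, variance, and intensity range
    of the feature points lying inside it."""
    vals = np.array([gray[int(y), int(x)] for x, y in points], dtype=np.float32)
    return np.array([vals.mean(), vals.var(), vals.max() - vals.min()])


def compensate_camera_motion(prev_gray, curr_gray):
    """Estimate camera motion from background feature points and warp the
    previous frame into the current frame's coordinate system."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                  qualityLevel=0.01, minDistance=7)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    ok = status.ravel() == 1
    p0, p1 = pts[ok].reshape(-1, 2), nxt[ok].reshape(-1, 2)

    # Cluster the motion vectors with k-means; the dominant cluster is
    # assumed to belong to the background (i.e. the camera motion).
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(p1 - p0)
    bg = labels == np.bincount(labels).argmax()

    # Transformation matrix mapping the previous frame onto the current one
    # (a partial affine model is an assumption here).
    M, _ = cv2.estimateAffinePartial2D(p0[bg], p1[bg])
    warped = cv2.warpAffine(prev_gray, M, prev_gray.shape[::-1])
    return warped, p1


def detect_and_track(prev_gray, curr_gray, template, diff_thresh=25):
    """Frame-difference the motion-compensated frames, group the moving
    feature points with DBSCAN, and select the cluster whose statistical
    template is closest to the tracked object's template."""
    warped_prev, curr_pts = compensate_camera_motion(prev_gray, curr_gray)
    moving_mask = cv2.absdiff(warped_prev, curr_gray) > diff_thresh

    h, w = moving_mask.shape
    moving = np.array([p for p in curr_pts
                       if 0 <= p[0] < w and 0 <= p[1] < h
                       and moving_mask[int(p[1]), int(p[0])]])
    if len(moving) == 0:
        return None

    labels = DBSCAN(eps=30, min_samples=5).fit_predict(moving)
    best, best_dist = None, np.inf
    for lab in set(labels) - {-1}:           # -1 marks DBSCAN noise points
        cluster = moving[labels == lab]
        dist = np.linalg.norm(region_template(curr_gray, cluster) - template)
        if dist < best_dist:
            best, best_dist = cluster, dist
    return best  # feature points of the tracked object in the current frame
```

In use, the template would be computed once on the first frame from the feature points of the manually selected region (e.g. template = region_template(first_gray, selected_points)) and then passed to detect_and_track on every subsequent frame.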


Notes

  1. http://www.emgu.com/wiki/index.php/Main_Page

  2. The bounding box that surrounds the whole object of interest, which is calculated manually in every frame of the test videos.

  3. The best rates are boldfaced in the tables.


Author information

Correspondence to Nacer Farajzadeh.


Cite this article

Farajzadeh, N., Karamiani, A. & Hashemzadeh, M. A fast and accurate moving object tracker in active camera model. Multimed Tools Appl 77, 6775–6797 (2018). https://doi.org/10.1007/s11042-017-4597-x
