
Visual object tracking using Fourier domain phase information

  • Original Paper
  • Published:
Signal, Image and Video Processing

Abstract

In this article, the phase of the Fourier transform (FT), which has been observed to be a crucial component of image representation, is utilized for visual target tracking. The main aim of the proposed scheme is to reduce the computational complexity of cross-correlation-based matching frameworks. The normalized cross-correlation (NCC)-based object tracker is converted into a phase minimization problem under the following assumption: in visual object tracking applications, if the frame rate is high, the moving object can be considered to undergo only translational shifts in the image domain within a small time window. Since the proposed tracking framework operates in the Fourier domain, translational shifts in image space are converted into phase variations in the Fourier domain by the translation (shift) property of the FT. The proposed algorithm estimates the spatial target position from the phase information of the target region. The proposed framework uses the \(\ell _1\)-norm and provides a computationally efficient solution to the tracking problem. Experimental studies indicate that the proposed phase-based technique obtains results comparable to those of baseline tracking algorithms that are computationally more complex.
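The shift property referred to above can be stated as follows: if a target patch is translated by \((d_y, d_x)\) pixels between consecutive frames, its FT is multiplied by the linear-phase term \(e^{-j2\pi(v d_y + u d_x)}\) (with \(u, v\) normalized spatial frequencies), so the displacement is encoded entirely in the phase difference between the two transforms. The snippet below is a minimal illustrative sketch of this idea in Python, not the authors' implementation: it estimates an integer displacement between two equally sized patches by searching for the candidate shift whose predicted linear phase minimizes the \(\ell _1\)-norm of the wrapped phase residual. The function name, the search range, and the brute-force grid search are assumptions made for clarity.

```python
import numpy as np

def estimate_shift_phase_l1(template, patch, max_shift=8):
    """Illustrative phase-based shift estimator (hypothetical helper, not the
    paper's exact algorithm).

    A circular translation (dy, dx) of `template` multiplies its 2-D FT by the
    linear phase exp(-2j*pi*(dy*fy + dx*fx)), where fy, fx are normalized
    frequencies. The estimator picks the integer shift whose predicted linear
    phase minimizes the l1 norm of the wrapped phase residual.
    """
    assert template.shape == patch.shape
    rows, cols = template.shape
    Ft, Fp = np.fft.fft2(template), np.fft.fft2(patch)
    # Observed phase difference between the shifted patch and the template.
    dphi = np.angle(Fp * np.conj(Ft))
    fy, fx = np.meshgrid(np.fft.fftfreq(rows), np.fft.fftfreq(cols), indexing="ij")

    best_cost, best_shift = np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            pred = -2.0 * np.pi * (dy * fy + dx * fx)      # phase of a (dy, dx) shift
            resid = np.angle(np.exp(1j * (dphi - pred)))   # wrap residual to (-pi, pi]
            cost = np.abs(resid).sum()                     # l1 phase cost
            if cost < best_cost:
                best_cost, best_shift = cost, (dy, dx)
    return best_shift

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((64, 64))
    moved = np.roll(frame, shift=(3, -5), axis=(0, 1))   # simulate a (3, -5) translation
    print(estimate_shift_phase_l1(frame, moved))         # expected output: (3, -5)
```

In a tracking loop, such an estimate would replace the spatial NCC sweep: the FT of the stored target template is compared against the FT of the search window in each new frame, and the window is re-centred at the estimated displacement. The grid search here is only for readability; a coarse-to-fine or gradient-based solver would normally be preferred.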


Author information


Corresponding author

Correspondence to Serdar Cakir.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Cakir, S., Cetin, A.E. Visual object tracking using Fourier domain phase information. SIViP 16, 119–126 (2022). https://doi.org/10.1007/s11760-021-01968-5



  • DOI: https://doi.org/10.1007/s11760-021-01968-5
