
A visual object tracking benchmark for cell motility in time-lapse imaging

  • Original Paper
  • Published in: Signal, Image and Video Processing

Abstract

Automatic tracking of cells is a widely studied problem in many biomedical applications. Although numerous approaches exist for the video object tracking task in different contexts, their performance depends on many factors specific to the application at hand. This paper presents a comparative study that specifically targets the cell tracking problem and compares the performance of recent algorithms. We propose a framework for evaluating tracking algorithms and compare several state-of-the-art object tracking approaches on an extensive time-lapse inverted-microscopy dataset. We report quantitative evaluations of the algorithms based on the success rate and precision performance metrics.
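The success rate and precision metrics are not defined on this page, but in visual-tracking benchmarks they are conventionally computed per frame from the predicted and ground-truth bounding boxes: success is the fraction of frames whose overlap (IoU) with the ground truth exceeds a threshold, and precision is the fraction of frames whose center-location error stays below a pixel threshold. The Python sketch below illustrates that convention only; the function names, the [x, y, w, h] box format, and the threshold ranges are illustrative assumptions, not the authors' evaluation code.

```python
import numpy as np

def iou(box_a, box_b):
    # Intersection-over-union of two axis-aligned [x, y, w, h] boxes.
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    yb = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0.0, xb - xa) * max(0.0, yb - ya)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def success_and_precision(pred, gt,
                          iou_thresholds=np.linspace(0.0, 1.0, 21),
                          err_thresholds=np.arange(0, 51)):
    # pred, gt: (N, 4) arrays of per-frame [x, y, w, h] boxes.
    pred, gt = np.asarray(pred, float), np.asarray(gt, float)
    overlaps = np.array([iou(p, g) for p, g in zip(pred, gt)])
    centers_p = pred[:, :2] + pred[:, 2:] / 2.0   # box centers
    centers_g = gt[:, :2] + gt[:, 2:] / 2.0
    errors = np.linalg.norm(centers_p - centers_g, axis=1)
    # Success curve: fraction of frames above each IoU threshold.
    success = np.array([(overlaps > t).mean() for t in iou_thresholds])
    # Precision curve: fraction of frames below each center-error threshold.
    precision = np.array([(errors <= t).mean() for t in err_thresholds])
    return success, precision
```

Scalar summaries commonly reported alongside these curves are the area under the success curve and the precision at a fixed error threshold (often 20 pixels).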


Notes

  1. DSST Source Code: http://www.cvl.isy.liu.se/en/research/objrec/visualtracking/scalvistrack/DSST_code.zip.

  2. FCT Source Code: http://www4.comp.polyu.edu.hk/~cslzhang/FCT/FCT.htm.

  3. IVT Source Code: http://www.cs.toronto.edu/~dross/ivt/.

  4. KCF Source Code: https://github.com/vojirt/kcf.

  5. L1APG Source Code: https://github.com/lukacu/visual-tracking-matlab/tree/master/l1apg.

  6. MILTrack Source Code: https://github.com/lukacu/mil.

  7. MOSSE Source Code: https://github.com/albertoQD/tracking-mosse.

  8. ODFS Source Code: http://www4.comp.polyu.edu.hk/~cslzhang/ODFS/ODFS.htm.

  9. SRDCF Source Code: https://www.cvl.isy.liu.se/en/research/objrec/visualtracking/regvistrack/.

  10. Staple Source Code: https://github.com/bertinetto/staple.

  11. LADCF Source Code: http://www.votchallenge.net/vot2018/trackers.html.

  12. GOTURN Source Code: https://github.com/foolwood/GOTURN_matconvnet.


Author information

Correspondence to Rengul Cetin Atalay.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Demir, H.S., Cetin, A.E. & Cetin Atalay, R. A visual object tracking benchmark for cell motility in time-lapse imaging. SIViP 13, 1063–1070 (2019). https://doi.org/10.1007/s11760-019-01443-2

