Abstract
Automatic tracking of cells is a widely studied problem in various biomedical applications. Although numerous approaches exist for video object tracking in different contexts, their performance depends strongly on the specific application in which they are used. This paper presents a comparative study that specifically targets the cell tracking problem and compares the behavior of recent algorithms. We propose a framework for evaluating tracking performance and compare several state-of-the-art object tracking approaches on an extensive time-lapse inverted microscopy dataset. We report quantitative evaluations of the algorithms based on the success rate and precision performance metrics.
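For readers unfamiliar with these metrics, the sketch below shows how OTB-style success and precision scores are commonly computed from per-frame bounding boxes. It is a minimal illustration only; the exact protocol and thresholds used in this paper may differ, and the function names, box format (x, y, w, h), and default thresholds are assumptions for the example.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    yb = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0.0, xb - xa) * max(0.0, yb - ya)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def center_error(box_a, box_b):
    """Euclidean distance between box centers, in pixels."""
    ca = (box_a[0] + box_a[2] / 2.0, box_a[1] + box_a[3] / 2.0)
    cb = (box_b[0] + box_b[2] / 2.0, box_b[1] + box_b[3] / 2.0)
    return np.hypot(ca[0] - cb[0], ca[1] - cb[1])

def evaluate(pred_boxes, gt_boxes, overlap_thr=0.5, dist_thr=20.0):
    """Per-sequence success rate and precision from per-frame boxes."""
    overlaps = np.array([iou(p, g) for p, g in zip(pred_boxes, gt_boxes)])
    errors = np.array([center_error(p, g) for p, g in zip(pred_boxes, gt_boxes)])
    success_rate = float(np.mean(overlaps > overlap_thr))  # fraction of frames above the overlap threshold
    precision = float(np.mean(errors <= dist_thr))         # fraction of frames within the pixel threshold
    return success_rate, precision
```

Sweeping `overlap_thr` over [0, 1] and `dist_thr` over a range of pixel distances yields the success and precision curves from which trackers are typically ranked.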



Notes
FCT Source Code: http://www4.comp.polyu.edu.hk/~cslzhang/FCT/FCT.htm.
IVT Source Code: http://www.cs.toronto.edu/~dross/ivt/.
KCF Source Code: https://github.com/vojirt/kcf.
L1APG Source Code: https://github.com/lukacu/visual-tracking-matlab/tree/master/l1apg.
MILTrack Source Code: https://github.com/lukacu/mil.
MOSSE Source Code: https://github.com/albertoQD/tracking-mosse.
ODFS Source Code: http://www4.comp.polyu.edu.hk/~cslzhang/ODFS/ODFS.htm.
SRDCF Source Code: https://www.cvl.isy.liu.se/en/research/objrec/visualtracking/regvistrack/.
Staple Source Code: https://github.com/bertinetto/staple.
LADCF Source Code: http://www.votchallenge.net/vot2018/trackers.html.
GOTURN Source Code: https://github.com/foolwood/GOTURN_matconvnet.