Abstract
A good tracking system generally requires substantial computation time to localize the target object accurately, and for real-time tracking applications the running time is a critical factor. In this paper, a GPU implementation of a tracking system based on chromatic co-occurrence matrices (CCM) is proposed. Descriptors based on CCMs improve tracking accuracy, but they are expensive to compute. To overcome this limitation, a GPU-based parallel implementation of these matrices is incorporated into the tracker. The developed algorithm is then deployed on an embedded system to build a real-time autonomous embedded tracking system. Experimental results show that the GPU version of the tracker achieves a 150% speedup over the CPU version.
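The counting step that the paper parallelizes on the GPU can be illustrated with a minimal CPU reference sketch. The function below is an assumption for illustration (the name `chromatic_cooccurrence`, the default `offset`, and the `levels` quantization are not taken from the paper): it counts, for a pair of color channels, how often a quantized value in channel A at pixel p co-occurs with a quantized value in channel B at p + offset, which is the inter-channel generalization of the gray-level co-occurrence matrix.

```python
import numpy as np

def chromatic_cooccurrence(ch_a, ch_b, offset=(0, 1), levels=8):
    """CPU reference for a chromatic co-occurrence matrix.

    Counts co-occurrences of quantized channel-A values at pixel p
    with quantized channel-B values at p + offset. Assumes 8-bit
    channels and a non-negative (dy, dx) offset; names and defaults
    are illustrative, not the paper's implementation.
    """
    dy, dx = offset
    h, w = ch_a.shape
    # Quantize both 8-bit channels down to `levels` bins.
    qa = (ch_a.astype(np.int64) * levels) // 256
    qb = (ch_b.astype(np.int64) * levels) // 256
    ccm = np.zeros((levels, levels), dtype=np.int64)
    # Keep only pixel pairs whose neighbor stays inside the image.
    a = qa[:h - dy, :w - dx]
    b = qb[dy:, dx:]
    # Scatter-add each (a, b) pair into the histogram; on a GPU this
    # scatter is the step that is done with atomic additions.
    np.add.at(ccm, (a.ravel(), b.ravel()), 1)
    return ccm
```

The scatter-add on the last line is the operation that becomes a race on a GPU, which is why a parallel version typically relies on atomic increments or per-block sub-histograms that are merged afterwards.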
Acknowledgements
We gratefully acknowledge the support of NVIDIA Corporation with the donation of the Jetson TX1 onboard card used for this research.
Cite this article
Elafi, I., Jedra, M. & Zahid, N. GPU-based chromatic co-occurrence matrices for tracking moving objects. J Real-Time Image Proc 17, 1197–1210 (2020). https://doi.org/10.1007/s11554-019-00874-x