Abstract
Because of the well-known merits of non-parametric estimation and fast mode matching, the mean shift tracking algorithm has been widely adopted with demonstrated success. However, the traditional mean shift tracker uses a center-weighted color histogram as its reference model, which is susceptible to interference from background pixels and can compromise tracking robustness. To address this problem, we propose a midlevel-cues mean shift visual tracking algorithm based on a saliency-weighted target-background confidence map. A discriminative appearance model based on superpixels is introduced, enabling the tracker to distinguish target from background through different weights. The improved tracker computes a target-background saliency confidence map and performs mean shift iteration on it to obtain the target position in the next frame. Experimental results demonstrate that the improved mean shift tracker is able to handle occlusion and recover from tracking drift; furthermore, it facilitates foreground object segmentation during tracking.
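As a rough illustration of the pipeline the abstract describes, the Python sketch below builds a crude per-superpixel target-background confidence map and runs a confidence-weighted mean shift iteration on it. This is a minimal sketch, not the authors' method: the overlap-based confidence is a stand-in assumption for the paper's learned discriminative superpixel appearance model, and all function names (superpixel_confidence_map, mean_shift_step, track) are illustrative.

import numpy as np
from skimage.segmentation import slic

def superpixel_confidence_map(frame, target_box, n_segments=200):
    """Assign each superpixel a target-vs-background confidence in [0, 1].

    frame: HxWx3 RGB image; target_box: (x, y, w, h) current estimate.
    The overlap fraction below is a placeholder for a learned
    discriminative weight per superpixel.
    """
    labels = slic(frame, n_segments=n_segments, compactness=10.0)
    x, y, w, h = target_box
    inside = np.zeros(labels.shape, dtype=bool)
    inside[y:y + h, x:x + w] = True
    conf = np.zeros(labels.shape, dtype=np.float64)
    for sp in np.unique(labels):
        mask = labels == sp
        # Confidence = fraction of the superpixel lying inside the box.
        conf[mask] = inside[mask].mean()
    return conf

def mean_shift_step(conf, center, bandwidth=30):
    """One mean shift iteration: confidence-weighted centroid in a window."""
    h, w = conf.shape
    cx, cy = center
    x0, x1 = max(0, cx - bandwidth), min(w, cx + bandwidth)
    y0, y1 = max(0, cy - bandwidth), min(h, cy + bandwidth)
    win = conf[y0:y1, x0:x1]
    if win.sum() < 1e-9:          # no confident pixels: stay put
        return center
    ys, xs = np.mgrid[y0:y1, x0:x1]
    return (int(round((xs * win).sum() / win.sum())),
            int(round((ys * win).sum() / win.sum())))

def track(frame, prev_box, max_iter=20, eps=1):
    """Shift the box center until the weighted mean converges."""
    x, y, w, h = prev_box
    center = (x + w // 2, y + h // 2)
    conf = superpixel_confidence_map(frame, prev_box)
    for _ in range(max_iter):
        new_center = mean_shift_step(conf, center)
        if (abs(new_center[0] - center[0]) +
                abs(new_center[1] - center[1])) <= eps:
            break
        center = new_center
    return (center[0] - w // 2, center[1] - h // 2, w, h)

In a full tracker the per-superpixel confidence would instead come from a discriminative model learned over previous frames and updated online, so that the confidence map stays reliable under occlusion and drift as described in the abstract.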
Cite this article
Hu, Z., Xie, R., Wang, M. et al. Midlevel cues mean shift visual tracking algorithm based on target-background saliency confidence map. Multimed Tools Appl 76, 21265–21280 (2017). https://doi.org/10.1007/s11042-016-4068-9