Abstract
Sparsity-based learning methods have recently attracted much attention in robust visual tracking owing to their effectiveness and promising tracking results. In this paper, we introduce a new particle-filter-based tracking method that represents the target object sparsely using only a few adaptive dictionary templates. We aim to capture the underlying structure among the particle samples via a proposed similarity graph within a Laplacian group sparse framework, so that the tracking results can be improved. Furthermore, in our tracker, particles contribute to the tracking result with different probabilities according to their positions in a given frame relative to the current target object location. In addition, since the new target object can be modelled well by the most recent tracking results, we prefer particle samples that are strongly associated with the preceding tracking results. We demonstrate that the proposed formulation can be solved efficiently using the accelerated proximal gradient method in only a small number of iterations. The proposed approach has been evaluated extensively on 12 challenging video sequences. Experimental results demonstrate the merits of the proposed tracker compared to state-of-the-art methods.
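To illustrate the accelerated proximal gradient (APG) machinery the abstract refers to, below is a minimal NumPy sketch applied to a generic \(\ell_1\)-regularised least-squares problem, using the step-size sequence \(\mu_k = 2/(k+1)\) from note 2. The dictionary \(D\), the function names, and the fixed iteration count are illustrative assumptions; the paper's actual objective additionally contains the Laplacian group-sparsity term, which this sketch omits.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (element-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def apg_l1(D, y, lam, n_iter=100):
    """APG/FISTA sketch for min_x 0.5 * ||D x - y||^2 + lam * ||x||_1."""
    n = D.shape[1]
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth gradient
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for k in range(1, n_iter + 1):
        mu = 2.0 / (k + 1)                 # conventional sequence from note 2
        mu_prev = 2.0 / k
        beta = mu * (1.0 / mu_prev - 1.0)  # momentum weight, equals (k - 2)/(k + 1)
        z = x + beta * (x - x_prev)        # extrapolation step
        grad = D.T @ (D @ z - y)           # gradient of the smooth term at z
        x_prev, x = x, soft_threshold(z - grad / L, lam / L)
    return x
```

With \(D = I\), one proximal step already lands on the soft-thresholded solution, which makes the sketch easy to sanity-check.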
Notes
- 1.
\(m = 1024\)-dimensional grayscale features.
- 2.
\(\mu _{k}\) is conventionally set to \(\frac{2}{k+1}\).
- 3.
We denote \(\widetilde{c}_{i}\) as a discriminative feature. We then build a similarity graph by treating each point as a vertex and assigning the connection weight between nodes \(i\) and \(j\) as \(\left| \widetilde{c}_{ij} \right|\).
- 4.
We consider every \(n=5\) frames.
- 5.
This is the element-wise product of two \(1 \times N\) vectors, where \(N\) is the number of sampled particles.
- 6.
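The similarity-graph construction in note 3 can be sketched as follows: edge weights \(\left|\widetilde{c}_{ij}\right|\) between particle vertices, followed by the unnormalised graph Laplacian needed for a Laplacian sparse-coding regulariser. The matrix name `C`, the symmetrisation step, and the zeroed diagonal are assumptions for the sketch, not details stated in the paper.

```python
import numpy as np

def similarity_graph_laplacian(C):
    """Build the similarity graph of note 3 and its graph Laplacian.

    C is an (N x N) matrix whose entry c_ij relates particles i and j;
    the connection weight between vertices i and j is |c_ij|.
    """
    W = np.abs(C)            # edge weights |c_ij|
    W = 0.5 * (W + W.T)      # symmetrise, since c_ij may differ from c_ji
    np.fill_diagonal(W, 0.0) # no self-loops
    d = W.sum(axis=1)        # vertex degrees
    L = np.diag(d) - W       # unnormalised graph Laplacian L = D - W
    return W, L
```

The Laplacian built this way is symmetric positive semi-definite with zero row sums, which is what a Laplacian smoothness term \(\operatorname{tr}(X L X^{\top})\) requires.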
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Bozorgtabar, B., Goecke, R. (2015). Enhanced Laplacian Group Sparse Learning with Lifespan Outlier Rejection for Visual Tracking. In: Cremers, D., Reid, I., Saito, H., Yang, MH. (eds) Computer Vision -- ACCV 2014. ACCV 2014. Lecture Notes in Computer Science(), vol 9007. Springer, Cham. https://doi.org/10.1007/978-3-319-16814-2_37
Print ISBN: 978-3-319-16813-5
Online ISBN: 978-3-319-16814-2